A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

Uncategorized · 176 Posts · 77 Posters
Tags: #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
  • Stefano Marinelli

    A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

    I then suspected a power failure, but the UPS should have sent an alert.

    The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

    To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

    The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

    That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be triggered from the inside.

    The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

    The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

    Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

    Never rely only on internal monitoring. Never.

    #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
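The principle behind that external check — probe the site from outside and treat total silence as the alarm — can be sketched in a few lines of Python. This is only an illustration of the idea, with hypothetical target addresses; Uptime Kuma does the same job with a web UI and built-in notifications:

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def check_site(targets, alert):
    """Run from OUTSIDE the monitored network.

    Alert when *every* target is unreachable: one dead host is a host
    problem, but total silence is a site problem (power, uplinks, or worse).
    """
    down = [(host, port) for host, port in targets if not tcp_reachable(host, port)]
    if targets and len(down) == len(targets):
        alert(f"All {len(targets)} targets unreachable - possible site-wide event")
    return down

# Hypothetical example: the site's router and a camera recorder.
# check_site([("203.0.113.10", 8291), ("203.0.113.11", 443)], print)
```

The point is where the check runs, not how: an internal alerting system that has itself been fried can't tell you anything.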

  • Ángel (#52)

    Oh my 😱
  • penguin42 (#53)

    @stefano There was an attack a few years back near here where they dropped burning rubbish into manholes around a data centre; the theory at the time was that it was to cut off some CCTV or alarm monitoring for something. Well caught!

  • Space flip-flops (#54)

    @stefano Cool story, bro, but it's too fictional, I'd say.
    First off, as a Ukrainian, I know that power lines can survive "spikes" by simply cutting the power at the very input. No damage to the equipment behind the input circuit breaker, nope; you just get a damaged input.
    Next, I used to work in a bank, and there we had a clear requirement for the data storage center: more than one power input is a must.

  • Space flip-flops (#55)

    @stefano
    Third, given it's a data center, power consumption is probably tens of kW. The "gang" could easily get themselves killed playing with that.
    Fourth, a power spike and cutoff won't go unnoticed by whoever operates the power lines. They'll be the first on site to see what happened.

  • Conny Nasch (#56)

            @stefano thank you for this knowledge, I have boosted it for reference for others. 🤗

  • Space flip-flops (#57)

              @stefano but otherwise it's a cool horror story, yeah 😃

  • Stefano Marinelli wrote:

                @_elena Thank you! Sure, I will 👍
                But, to be honest, I don't think any of those stories will ever be a film.

                The big, most scary one is yet to come, anyway...

    Bob Tregilus (#58)

    @stefano I don't know, you told this short story like a pro. Starts out, ya, data center suddenly goes dark over the holidays. UPS fails, kind of, ya, ya, still interesting. Then you introduce the gold, two-meter-thick walls, professional thieves, wow, that's some drama! Although I wonder how they were able to send such a massive power surge down the lines, and why the mains breakers didn't blow before the equipment was damaged? Looking forward to your next tale!

                @_elena

  • TwiceBitten (#59)

                  @EnigmaRotor @stefano or the case of the red fire button killer

  • gbsills (#60)

                    @stefano thanks for sharing this.

  • Pedro Bufulin (#61)
                      @stefano why do you assume that via 4G there would be connectivity? I don't get this part, what am I missing?
  • William Weber Berrutti (#62)

    @stefano This is pretty important knowledge to have! Bookmarked for future reference!

  • Stefano Marinelli (#63)

                          @fisher It's not a bank, and it's definitely not a top-tier data center. It's just a trading company in a typical industrial area in Northern Italy, with fewer than 50 employees. They've only got four aging servers in a rack. Facilities like this don't need bank-grade security or the kind of resilience you'd find in a war zone. I wish it were fictional! 😃

  • The Gaffer (#64)

                            @stefano This immediately brought to mind coming into the office after a holiday weekend in 2005 and finding “my” computer room dark. I found our infrastructure manager, who told me that they had an unexpected power outage over the weekend. Confused, I said “But how is that possible? We have multiple feeds and a huge uninterruptible power supply!”

                            I will never forget his response, delivered in his thick Scottish brogue: “Yes, we do. But it doesn’t do much good when the UPS catches fire.” 😳

  • Stefano Marinelli (#65)

    @elaterite @_elena Fair question 🙂
    I'm just relaying what I was told and what I know about the company, which I've been providing services to for many years. The details came directly from their internal manager and, honestly, I didn't have much interest in digging deeper into the technical specifics of the incident.
    My focus was simply making sure their servers were back up and running and that their data was safe. Everything else (electrical infrastructure, physical security, and similar aspects) is outside my scope and handled by other people.

  • Stefano Marinelli (#66)

    @pedro If the two FTTH providers are down, the router will use the failover 4G connection to reach my VPN (and alert me).
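That failover chain doubles as a diagnostic: which links answer tells you what kind of outage you're looking at. A rough sketch of the triage logic (hypothetical function, assuming two FTTH uplinks plus an LTE fallback, like the setup described):

```python
def classify_outage(ftth1_up: bool, ftth2_up: bool, lte_up: bool) -> str:
    """Map the reachability pattern of the three uplinks to a likely cause."""
    if ftth1_up or ftth2_up:
        return "ok"            # at least one landline path still works
    if lte_up:
        return "isp-outage"    # both FTTH down, but the router is alive on 4G
    return "site-down"         # total silence: suspect power loss, not the ISPs
```

In the story, even the 4G path was silent, which is what pointed away from a mere connectivity drop.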

                                • Stefano MarinelliS Stefano Marinelli

                                  @elaterite @_elena Fair question 🙂
                                  I'm just relaying what I was told and what I know about the company, for which I've been providing some services for many years. The details came directly from their internal manager and, honestly, I didn't have much interest in digging deeper into the technical specifics of the incident.
                                  My focus was simply making sure their servers were back up and running and that their data was safe. Everything else, electrical infrastructure, physical security, and similar aspects, is outside of my scope and handled by other people.

Bob Tregilus
#67

@stefano Indeed. Still, it would be interesting to learn the details of the infrastructure failure and how they pulled it off. Sounds like a good story for a documentary, especially if this kind of attack has happened before.

                                  @_elena

• Stefano Marinelli

                                    A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

                                    I then suspected a power failure, but the UPS should have sent an alert.

                                    The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

                                    To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

                                    The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be sent from inside the site.
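The external-check idea can be sketched as follows. This is a hypothetical stand-in for what Uptime Kuma does (the real tool is configured through its web UI, and the target IPs here are placeholders): probe several hosts from outside, and treat "everything silent at once" as the alarming case rather than a single host flapping.

```python
# Hypothetical sketch of an external reachability check in the spirit of
# the Uptime Kuma setup above. IPs are placeholders; real Uptime Kuma is
# configured via its web UI, not code like this.
import subprocess

TARGETS = ["192.0.2.1", "192.0.2.2", "192.0.2.3"]  # e.g. router, UPS card, NVR (hypothetical)

def is_reachable(host: str, timeout_s: int = 2) -> bool:
    """Send one ICMP echo; treat any failure or timeout as 'down'."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), host],
        capture_output=True,
    )
    return result.returncode == 0

def classify(results: dict[str, bool]) -> str:
    """Distinguish a single flap from total silence across all targets."""
    up = sum(results.values())
    if up == len(results):
        return "ok"
    if up == 0:
        return "total-silence"   # everything dark at once: escalate immediately
    return "partial-outage"
```

The key design point is that the check runs from outside the site, so it needs no surviving equipment on-premises to raise the alarm.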

                                    The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

                                    The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

                                    Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

                                    Never rely only on internal monitoring. Never.

                                    #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring

indyradio
#68

@stefano That's impressive. Meanwhile, I accidentally stumbled on your website: you have shared many useful items in a thoughtful way. I appreciate it, and am glad to let you know. 😀

• Stefano Marinelli

indyradio
#69

@stefano @pedro Power line monitoring is important even for "normal" failures, because some are destructive.
Since 9/11 there have been a few new spooky things; one is modulating the power with pulses.

• Stefano Marinelli

@bojanlandekic Thank you! I'm just trying to spread some real-life experiences.

Bojan Landekić
#70

                                        @stefano it is the criminals among us who make life difficult for all. Not even the greatest sci-fi authors have been able to imagine how beautiful and fun a future we all would have without them!

• Stefano Marinelli
Jim
#71

                                          @stefano
                                          Wow! Cool story

Powered by NodeBB Contributors