A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

Uncategorized · 176 posts, 77 posters
Tags: sysadmin, horrorstories, ithorrorstories, monitoring

• Bob Tregilus wrote:

  @stefano I don't know, you told this short story like a pro. Starts out, ya, data center suddenly goes dark over the holidays. UPS fails, kinda, ya, still interesting. Then you introduce the gold, two-meter-thick walls, professional thieves, wow, that's some drama! Although I wonder how they were able to send such a massive power surge down the lines, and why the bus mains didn't blow before the equipment was damaged. Looking forward to your next tale!

  @_elena

Stefano Marinelli
#65

@elaterite @_elena Fair question 🙂
I'm just relaying what I was told and what I know about the company, which I've been providing services to for many years. The details came directly from their internal manager and, honestly, I didn't have much interest in digging deeper into the technical specifics of the incident.
My focus was simply making sure their servers were back up and running and that their data was safe. Everything else (electrical infrastructure, physical security, and similar aspects) is outside my scope and handled by other people.

• Pedro Bufulin wrote:
  @stefano why do you assume there would be connectivity via 4G? I don't get this part; what am I missing?

Stefano Marinelli
#66

@pedro if the two FTTH providers are down, the router will use the failover 4G connection to reach my VPN (and alert me).
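
For the curious, the multi-path check Stefano describes can be sketched from the monitoring side in a few lines of Python. This is a hypothetical illustration, not his actual setup: the addresses, port choices, and three-path layout are placeholders (8291 is MikroTik's Winbox management port, used here only as a "does the router answer?" probe).

```python
#!/usr/bin/env python3
"""Monitoring-side sketch: probe each WAN path of a site from OUTSIDE
and classify its state. All endpoints below are hypothetical."""

import socket

# Hypothetical public endpoints for the three WAN paths.
WAN_PATHS = {
    "ftth_primary":   ("203.0.113.10", 8291),   # router mgmt port via ISP 1
    "ftth_secondary": ("198.51.100.20", 8291),  # router mgmt port via ISP 2
    "lte_failover":   ("192.0.2.30", 1194),     # VPN endpoint reached via 4G
}

def tcp_alive(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def classify(results: dict) -> str:
    if results["ftth_primary"] or results["ftth_secondary"]:
        return "OK: at least one fibre path answers"
    if results["lte_failover"]:
        return "DEGRADED: both fibre links down, site alive only via 4G"
    return "CRITICAL: total silence on all paths (power cut? surge?)"

if __name__ == "__main__":
    results = {name: tcp_alive(host, port)
               for name, (host, port) in WAN_PATHS.items()}
    print(classify(results))  # a real deployment would page someone instead
```

The useful signal is the difference between "only the 4G path answers" (lines cut or ISPs down) and "nothing answers at all" (power gone), which is exactly the distinction that mattered in this story.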

• Stefano Marinelli wrote (see #65 above).

Bob Tregilus
#67

@stefano Indeed. Still, it would be interesting to find out the details of the infrastructure failure and how they pulled it off. Sounds like a good story for a documentary, especially if this kind of thing has happened before.

@_elena

• Stefano Marinelli wrote:

A few days ago, a client's data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

I then suspected a power failure, but the UPS should have sent an alert.

The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter-thick walls. They were targeted by a professional gang, which used a tactic seen in similar hits: identify the main power line, tamper with it at night, and send a massive voltage spike through it.

The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be triggered from the inside.

The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1 Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

Never rely only on internal monitoring. Never.

#IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
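
The moral lends itself to a tiny sketch: an external watcher that probes the site's public IPs and raises the alarm only after sustained, total silence. This is a hypothetical illustration of the idea, not Stefano's Uptime Kuma configuration; the hosts, port, and thresholds are placeholders.

```python
#!/usr/bin/env python3
"""Minimal external 'has the site gone dark?' monitor, in the spirit of
the Uptime Kuma check described above. Everything below is hypothetical."""

import socket
import time

# Hypothetical public IPs of the site's routers/servers.
HOSTS = ["203.0.113.10", "203.0.113.11", "203.0.113.12"]
CHECK_PORT = 443          # any service expected to answer externally
INTERVAL_S = 60           # probe once a minute
FAILS_BEFORE_ALERT = 3    # require consecutive failures, like Kuma retries

def host_up(host: str, port: int = CHECK_PORT, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def alert(message: str) -> None:
    # Placeholder: wire this to email/SMS/Telegram in a real deployment.
    print(f"ALERT: {message}")

def main() -> None:
    consecutive_dark = 0
    while True:
        if any(host_up(h) for h in HOSTS):
            consecutive_dark = 0
        else:
            consecutive_dark += 1
            # The key property: this fires even if NOTHING inside the
            # site survives, because the check runs from outside.
            if consecutive_dark == FAILS_BEFORE_ALERT:
                alert("all monitored IPs unreachable: site has gone dark")
        time.sleep(INTERVAL_S)

if __name__ == "__main__":
    main()
```

The whole point is where it runs: on a box that shares no power, no ISP, and no rack with the site it watches.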

indyradio
#68

@stefano that's impressive. Meanwhile, I stumbled on your website:
You have shared many useful items in a thoughtful way. I appreciate it, and am glad to let you know. 😀

• Stefano Marinelli wrote (see #66 above).

indyradio
#69

@stefano @pedro Power-line monitoring is important even for "normal" failures, because some are destructive.
Since 9/11 there are a few new spooky things, and one is modulating the power with pulses.

• Stefano Marinelli wrote:

  @bojanlandekic thank you! I'm just trying to spread some real-life experiences.

Bojan Landekić
#70

@stefano it is the criminals among us who make life difficult for everyone. Not even the greatest sci-fi authors have been able to imagine what a beautiful, fun future we would all have without them!

• Stefano Marinelli wrote (the original post, quoted in full above).

Jim
#71

@stefano
Wow! Cool story

• Bob Tregilus wrote (see #67 above).

Stefano Marinelli
#72

@elaterite @_elena The police are investigating, and I know some technicians are scheduled to go on site in the next few days. There will also be an insurance report, so I'll try to get some more information.

• indyradio wrote (see #68 above).

Stefano Marinelli
#73

@indyradio thank you!!!

• Stefano Marinelli wrote (the original post, quoted in full above).

itthinx
#74

@stefano Great story and appropriate setup!

• Stefano Marinelli wrote (the original post, quoted in full above).

Wokebloke for Democracy
#75

@stefano
Hey! Thanks for the inside story! I love happy endings.

• indyradio wrote (see #69 above).

Pedro Bufulin
#76

@indyradio @stefano Modulating power with pulses? What is that? How does it work? What does it achieve?

I have so many questions...
Honestly, I know nothing about electrical wizardry; I went too deep into computer science and never really touched that layer much.

• Stefano Marinelli wrote (see #66 above).

Pedro Bufulin
#77

@stefano how do you think they managed to burn the 4G? I suppose the battery for the 4G should not even be in the same "grid" as the other stuff, right? (I'm not sure anymore if I know how electricity works; guess I always took it for granted.)

• Stefano Marinelli wrote (the original post, quoted in full above).

Ondrej Zizka
#78

@stefano Thanks for all the info about the company's internal setup.

• Stefano Marinelli wrote (the original post, quoted in full above).

George E. 🇺🇸♥🇺🇦🇵🇸🏳️‍🌈🏳️‍⚧️
#79

@stefano@mastodon.bsd.cafe
In the critical infrastructure sector, controls are designed to fail open (as in, break the circuit) and monitoring systems all have watchdog timers. If an "I'm still here!" ping is not received when it's expected, an alarm goes off.

I say this not to distract from your original point.

External monitoring for critical systems is a must.
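
The watchdog-timer pattern George describes inverts the usual check: the absence of an expected "I'm still here!" signal is what raises the alarm. Below is a minimal, hypothetical sketch of an external heartbeat listener (Uptime Kuma's "push" monitors work on the same principle); the port, path, and timeout are placeholders.

```python
#!/usr/bin/env python3
"""Dead-man's-switch sketch: the site periodically GETs /heartbeat on an
EXTERNAL listener; prolonged silence, not a signal, triggers the alarm."""

import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

HEARTBEAT_TIMEOUT_S = 120   # alarm if no heartbeat for 2 minutes
last_beat = time.monotonic()
lock = threading.Lock()

class HeartbeatHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        global last_beat
        if self.path == "/heartbeat":        # site fetches this periodically
            with lock:
                last_beat = time.monotonic()
            self.send_response(200)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):            # silence the default access log
        pass

def watchdog() -> None:
    while True:
        time.sleep(5)
        with lock:
            silent_for = time.monotonic() - last_beat
        if silent_for > HEARTBEAT_TIMEOUT_S:
            # Fail-open semantics: absence of the signal IS the alarm.
            # A real system would page once and latch instead of printing.
            print(f"ALARM: no heartbeat for {silent_for:.0f}s")

if __name__ == "__main__":
    threading.Thread(target=watchdog, daemon=True).start()
    HTTPServer(("0.0.0.0", 8080), HeartbeatHandler).serve_forever()
```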

• Pedro Bufulin wrote (see #77 above).

Stefano Marinelli
#80

@pedro the 4G router was connected to the same UPS, so it wasn't destroyed, just off.

• Ondrej Zizka wrote (see #78 above).

Stefano Marinelli
#81

@OndrejZizka I never named the company 😉

• Stefano Marinelli wrote (the original post, quoted in full above).

joseph
#82

@stefano And while not relying on internal monitoring, make sure your external monitoring doesn't share anything with the monitored systems:

Different ISP, different cloud provider (if in the cloud), no shared infrastructure at any level.

• Stefano Marinelli wrote (the original post, quoted in full above).

Sharquaydius
#83

@stefano zapping the power lines, eh? Looks like the perfect solution to my nuisance neighbors with the big loudspeakers.

• Stefano Marinelli wrote (the original post, quoted in full above).

Dan 🌻
#84

@stefano The true horror part of this story:

> The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

Home for the holidays, sick, serious family issue?? Who cares! You know what's more important?? Keeping that data center up and running!

Glory to sacrificing yourself for the system!!

Or maybe get someone else next time.
