A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

Tags: sysadmin, horrorstories, ithorrorstories, monitoring
176 Posts, 77 Posters
Stefano Marinelli

    A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

    I then suspected a power failure, but the UPS should have sent an alert.

    The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

    To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

    The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

    That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be sent from inside the site.

    The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

    The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

    Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

    Never rely only on internal monitoring. Never.

    #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
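To make the external-check idea concrete: a minimal sketch, assuming a machine outside the site (different power, different network) that pings the site's public IPs and raises an alert when all of them go silent at once. The IP addresses, webhook URL, and ping flags below are illustrative assumptions (Linux iputils syntax), not the actual Uptime Kuma/MikroTik configuration.

```python
#!/usr/bin/env python3
# Minimal off-site "everything went dark" check (illustrative, not the real Uptime Kuma setup).
# Pings a few public IPs at the monitored site and posts an alert if *all* of them stop answering.
import json
import subprocess
import urllib.request

SITE_IPS = ["203.0.113.10", "203.0.113.11"]  # hypothetical router/firewall addresses (TEST-NET-3)
WEBHOOK = "https://example.invalid/alert"    # hypothetical notification endpoint

def reachable(ip: str) -> bool:
    """Return True if one ICMP echo request gets a reply within 2 seconds (Linux iputils flags)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def main() -> None:
    down = [ip for ip in SITE_IPS if not reachable(ip)]
    if len(down) == len(SITE_IPS):
        # Every probe failed at once: the whole site has "vanished", so alert immediately.
        payload = json.dumps({"message": f"Site unreachable from outside: {down}"}).encode()
        request = urllib.request.Request(
            WEBHOOK, data=payload, headers={"Content-Type": "application/json"}
        )
        urllib.request.urlopen(request, timeout=10)

if __name__ == "__main__":
    main()
```

Run from cron on a box that shares nothing with the site, a check like this needs nothing inside the building to be alive in order to raise the alarm, which is exactly the property the internal alerting lost when the surge hit.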

    Lasse Leegaard
    #42

    @stefano 10+ years ago I started volunteering at a festival. Everything was new that year, including the small outdoor racks for the area field routers (Juniper MX80). They barely fit, but we managed. The racks were left out in the summer sun. It was only when we enabled Observium (the predecessor of LibreNMS), which graphs almost everything it gets via SNMP, that we discovered the inlet temperature was getting close to 80 °C. #monitorallthethings
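For anyone who hasn't run Observium or LibreNMS: the reason "graph everything SNMP gives you" pays off is that most network gear already exposes environment sensors, inlet temperature included, over SNMP, so a collector graphs them without anyone asking for it. A rough sketch of polling one such sensor directly; the host, community string, OID, and threshold are placeholders, since the real sensor OID is vendor-specific and normally auto-discovered.

```python
#!/usr/bin/env python3
# Illustrative SNMP temperature poll. The OID and community string below are
# placeholders; tools like Observium/LibreNMS discover the vendor-specific OIDs for you.
import subprocess

HOST = "192.0.2.1"                  # hypothetical router address (TEST-NET-1)
COMMUNITY = "public"                # hypothetical read-only community string
TEMP_OID = "1.3.6.1.4.1.99999.1.1"  # placeholder OID for an inlet-temperature sensor
THRESHOLD_C = 60.0

def inlet_temperature() -> float:
    """Fetch a single sensor value with net-snmp's snmpget; -Oqv prints just the value."""
    result = subprocess.run(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", HOST, TEMP_OID],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout.strip())

temp = inlet_temperature()
if temp > THRESHOLD_C:
    print(f"WARNING: inlet temperature {temp:.1f} C exceeds {THRESHOLD_C} C")
```

Graphing that one value over time is what surfaced the near-80 °C inlet in this case.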

      Lasse Leegaard
      #43

      @stefano Since the racks were designed for outdoor use, they were watertight: only small holes in the bottom for cables and very limited provision for venting, like downward-facing holes in the "roof". They could supposedly float.

        Lasse Leegaard
        #44

        @stefano We ended up cutting some wide cable conduits at an angle and duct-taping them to the router, so one pipe covered the air inlet and another covered the exhaust. The other ends of the ducts were routed to the outside of the rack, lifted off the ground and pointed downward to keep water out. That provided fresh air in and a way to get rid of the hot air. We also fashioned some shade with a sheet of plywood. The year after, we put smaller equipment in 😎

          Rob\Viewdata UK
          #45

          @darkling @stefano
          Ferranti Computer Systems, Cheadle (UK), circa 1982. I was a lowly apprentice, at the time working in the department that oversaw the various VAXen most of the site used: three full-size machines and a handful of MicroVAXes, kept cool by *three* massive air-conditioner units on the external wall. The server room was always chilly. /cont

            rasteri
            #46

            @stefano I wonder how they generate a big enough power surge.

            Elena Rossini ⁂ wrote:

              @EnigmaRotor reading this at lunch in a cafe near my house and I keep chuckling and smiling from ear to ear. @stefano is such a treasure 🙌🏆

              EnigmaRotor
              #47

              @_elena @stefano And the café is the treasure island (“X” marks the place). 🎶“Heeeeee is a pirate, a jar of whiskey and a bottle of winnnneeeeee”🎶. Well that was a spontaneous Jack Sparrow moment. Sorry!

                Rob\Viewdata UK
                #48

                @darkling @stefano
                Until one morning I arrived to find chaos. One of the aircons had failed, and the others, overstressed, had completely iced up; the reduced airflow had caused the temperature in the room to rise. It was pretty much the hottest I'd ever encountered anywhere!
                Fire doors and internal doors were propped open to get a bit of airflow, and the blocked aircons were turned off. The heat then had a chance to melt the ice, and they could be brought back online later. I think it took all day.

                James Seward wrote:

                  @mkj @stefano @rhoot oh if audio's getting involved, you can use `ping -a` 😄

                  mkj
                  #49

                  @jamesoff `ping -af` 🙂

                  @stefano @rhoot

                    James Seward
                    #50

                    @mkj @stefano @rhoot "I don't even see the pings any more... it's just blonde, brunette, airhorn"

                      Falk Appel
                      #51

                      @rasteri Probably easy: you just need a big capacitor and a Van de Graaff belt generator (that thing from physics class) and whoosh, enough voltage and current to melt e.g. a screwdriver (did that in school 😅) @stefano

                        Ángel
                        #52
                        Oh my 😱

                          penguin42
                          #53

                          @stefano There was an attack a few years back near here where they dropped burning rubbish into manholes around a data centre; the theory at the time was that it was an attempt to cut off some CCTV or alarm monitoring for something. Well caught!

                            Space flip-flops
                            #54

                            @stefano Cool story bro, but it's too fictional, I'd say.
                            First off, as a Ukrainian, I know that installations can survive "the spikes": the breaker simply cuts the power right at the input. No damage to the equipment behind the input circuit breaker; you just get a damaged input.
                            Next, I used to work in a bank, and there we had a clear requirement for the data storage center: more than one power input is a must.

                              Space flip-flops
                              #55

                              @stefano
                              Third, given it's a data center, power consumption is probably tens of kW. The "gang" could easily get themselves killed playing with it.
                              Fourth, if there is a power spike and a cut-off, it won't go unnoticed by whoever operates the power lines; they will be the first on site to see what happened.

                                Conny Nasch
                                #56

                                @stefano thank you for this knowledge, I have boosted it for reference for others. 🤗

                                  Space flip-flops
                                  #57

                                  @stefano but otherwise it's a cool horror story, yeah 😃

                                    Stefano Marinelli wrote:

                                    @_elena Thank you! Sure, I will 👍
                                    But, to be honest, I don't think any of those stories will ever be a film.

                                    The biggest, scariest one is yet to come, anyway...

                                    Bob Tregilus
                                    #58

                                    @stefano I don't know, you told this short story like a pro. Starts out, ya, data center suddenly goes dark over the holidays; UPS fails, kind of ya, ya, still interesting; then you introduce the gold, the two-meter-thick walls, the professional thieves: wow, that's some drama! Although I wonder how they were able to send such a massive power surge down the lines, and why the main fuses didn't blow before the equipment was damaged? Looking forward to your next tale!

                                    @_elena

                                      TwiceBitten
                                      #59

                                      @EnigmaRotor @stefano or the case of the red fire button killer

                                        gbsills
                                        #60

                                        @stefano thanks for sharing this.

                                          Pedro Bufulin
                                          #61
                                          @stefano Why do you assume there would have been connectivity via 4G? I don't get this part; what am I missing?