
A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

  • indyradio

    @stefano that's impressive. meanwhile I accidentally stumbled on your website:
    You have shared many useful items in a thoughtful way. I appreciate it, and am glad to let you know. 😀

    Stefano Marinelli (#73)

    @indyradio thank you!!!

  • Stefano Marinelli

      A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

      I then suspected a power failure, but the UPS should have sent an alert.

      The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

      To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

      The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

      That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged that every one of the site's IPs had stopped responding, without needing any alert to be sent from inside.

      The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

      The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

      Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

      Never rely only on internal monitoring. Never.

      #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
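
      The principle behind that Uptime Kuma check is simply a probe that runs somewhere outside the site and raises an alert when every public IP goes quiet at once. Uptime Kuma does this (and the notifications) out of the box; purely to illustrate the idea, here is a minimal Python sketch in which the target addresses, the webhook URL and the thresholds are made-up placeholders, not anything from the setup described above.

      # external_check.py - rough sketch of an external reachability monitor.
      # All values below are illustrative placeholders.
      import json, subprocess, time, urllib.request

      TARGETS = ["203.0.113.10", "203.0.113.11"]    # the site's public WAN IPs (example values)
      WEBHOOK = "https://alerts.example.org/hook"   # hypothetical notification endpoint
      INTERVAL = 60                                 # seconds between check rounds
      FAILS_BEFORE_ALERT = 3                        # tolerate short blips

      def is_up(ip):
          # One ICMP echo with a 2-second timeout (Linux ping flags); True if the host answers.
          return subprocess.run(["ping", "-c", "1", "-W", "2", ip],
                                stdout=subprocess.DEVNULL,
                                stderr=subprocess.DEVNULL).returncode == 0

      def alert(message):
          # POST a small JSON payload to the notification webhook.
          req = urllib.request.Request(WEBHOOK,
                                       data=json.dumps({"text": message}).encode(),
                                       headers={"Content-Type": "application/json"})
          urllib.request.urlopen(req, timeout=10)

      fails = 0
      while True:
          if any(is_up(ip) for ip in TARGETS):
              fails = 0                             # at least one path answers; reset
          else:
              fails += 1
              if fails == FAILS_BEFORE_ALERT:       # all paths silent for several rounds
                  alert("Site unreachable on all WAN IPs: " + ", ".join(TARGETS))
          time.sleep(INTERVAL)

      What makes a check like this useful is where it runs: on a machine that shares no power, no uplink and no rack with the site it is watching.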

      itthinx (#74)

      @stefano Great story and appropriate setup!

        Wokebloke for Democracy (#75)

        @stefano
        Hey! Thanks for the inside story! I love happy endings.

  • indyradio

          @stefano @pedro power line monitoring is important even for "normal" failures, because some are destructive.
          Since 9/11 there are a few new spooky things, and one of them is modulating the power with pulses.

           Pedro Bufulin (#76)
          @indyradio @stefano modulating power with pulses? What is that? How does that work? What does it achieve?

          I have so many questions...
           Honestly, I know nothing about electrical wizardry; I went too deep into computer science and never really touched that layer much.
            • Stefano Marinelli

            @pedro if the two FTTH providers are down, the router will use the failover 4g connection to reach my VPN (and alert me).

             Pedro Bufulin (#77)
             @stefano how do you think they managed to burn the 4G? I suppose the battery for the 4G shouldn't even be on the same "grid" as the other stuff, right? (I'm not sure anymore that I know how electricity works; I guess I always took it for granted)

              Ondrej Zizka (#78)

              @stefano Thanks for all the info about the company's internal setup.

                George E. 🇺🇸♥🇺🇦🇵🇸🏳️‍🌈🏳️‍⚧️ (#79)

                @stefano@mastodon.bsd.cafe
                In the critical infrastructure sector, controls are designed to fail open (as in: break the circuit) and monitoring systems all have watchdog timers. If an "I'm still here!" ping is not received when expected, an alarm goes off.

                I say this not to distract from your original point.

                External monitoring for critical systems is a must.
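
                A watchdog of that kind is essentially a dead-man's switch: the alarm is triggered by the absence of an expected heartbeat rather than by an explicit error report, so a fried or powered-off system still produces an alert. Purely as an illustration of the receiving side, here is a small Python sketch; the heartbeat file path, the alert URL and the five-minute window are invented values, not part of any setup described in this thread.

                # deadman.py - rough sketch of a watchdog that alarms on MISSING heartbeats.
                import json, os, time, urllib.request

                HEARTBEAT_FILE = "/var/run/site-heartbeat"     # touched whenever the site checks in
                WINDOW = 300                                   # seconds of silence before alarming
                ALERT_URL = "https://alerts.example.org/hook"  # hypothetical notification endpoint

                def alarm(silent_for):
                    # POST a small JSON payload to the notification endpoint.
                    body = json.dumps({"text": f"No heartbeat for {int(silent_for)} s"}).encode()
                    req = urllib.request.Request(
                        ALERT_URL, data=body,
                        headers={"Content-Type": "application/json"})
                    urllib.request.urlopen(req, timeout=10)

                alarmed = False
                while True:
                    try:
                        silent_for = time.time() - os.path.getmtime(HEARTBEAT_FILE)
                    except FileNotFoundError:
                        silent_for = WINDOW + 1                # never seen: treat as silence
                    if silent_for > WINDOW and not alarmed:
                        alarm(silent_for)                      # fire once per outage
                        alarmed = True
                    elif silent_for <= WINDOW:
                        alarmed = False                        # heartbeat came back; re-arm
                    time.sleep(10)

                The heartbeat itself can be as simple as the monitored site touching that file through a tiny request every minute; the important property is that the alarm is driven by silence, not by a message a dead system would have to send.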

                  Stefano Marinelli (#80)

                  @pedro the 4G router was connected to the same UPS, so it wasn't destroyed, just off.

                    Stefano Marinelli (#81)

                    @OndrejZizka I never named the company 😉

                      joseph (#82)

                      @stefano And while not relying on internal monitoring, make sure your external monitoring doesn't share anything with the monitored systems:

                      Different ISP, different cloud provider if in the cloud, no shared infra at any level.

                        Sharquaydius (#83)

                        @stefano zapping the power lines, eh? Looks like the perfect solution to my nuisance neighbors with the big loudspeakers.

                          Dan 🌻 (#84)

                          @stefano The true horror part of this story:

                          > The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

                          Home for the holidays, sick, serious family issue?? Who cares! You know what's more important?? Keeping that data center up and running!

                          Glory to sacrificing yourself for the system!!

                          Or maybe get someone else next time.

                          • The Gaffer

                            @stefano This immediately brought to mind coming into the office after a holiday weekend in 2005 and finding “my” computer room dark. I found our infrastructure manager, who told me that they had an unexpected power outage over the weekend. Confused, I said “But how is that possible? We have multiple feeds and a huge uninterruptible power supply!”

                            I will never forget his response, delivered in his thick Scottish brogue: “Yes, we do. But it doesn’t do much good when the UPS catches fire.” 😳

                            Asiny (#85)

                            @thegaffer @stefano hahahaha

                              javensbukan (#86)

                              @thegaffer @stefano That reminds me of an incident that happened at work. We have multiple sources of electricity and generators, but none of that matters if the room with the UPS and power controller, where all the power sources meet, floods from an overflowing toilet on the floor above 🙃😅

                              Whoopsie daisy!

                              I had just finished moving all the network switches in the closets off that circuit when they managed to bypass it, and catastrophe was averted.

                              That was a fun night! /s

                                miki (#87)

                                 @stefano AFAIK, professional alarm systems should function based on the principle that "if it doesn't send periodic alerts saying that everything is ok, and there's no scheduled downtime, then something clearly isn't ok, and somebody needs to be sent to investigate it asap."

                                  Stefano Marinelli (#88)

                                  @danvolchek To be honest, I offered to rush there myself, but he refused and decided to go (he wasn't far from there).

                                    Stefano Marinelli (#89)

                                    @javensbukan @thegaffer suuure...fun... 😆

                                      Stefano Marinelli (#90)

                                      @miki I agree. In fact, their first move is to find out why the monitoring company didn't call or intervene.

                                        Sharquaydius (#91)

                                         @danvolchek @stefano One time I worked late and the alarm would not set. I called the security man and, on the phone, I heard a baby in the background.

                                         This was after 8pm, so I felt bad. I asked if that was the baby I could hear, and he confirmed. I told him I could chill until 10pm. Half an hour later he called to say he was coming soon, so I could lock the door and go home.

                                         This is what happens when we take responsibility. We tackle the messy reality. We make the everyday sacrifices.

                                          r1w1s1 (#92)
                                          Internal monitoring can go dark.
                                          External monitoring tells the truth.

                                          Great example of why both matter.