A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

176 Posts 77 Posters 0 Views
  • Pedro Bufulin
    @stefano how do you think they managed to burn the 4G? I'd assume the battery for the 4G shouldn't even be on the same "grid" as the other stuff, right? (I'm not sure I know how electricity works anymore; I guess I always took it for granted.)
    Stefano Marinelli
    #80

    @pedro the 4G router was connected to the same UPS. So it wasn't destroyed, just off.

  • Ondrej Zizka

      @stefano Thanks for all the info about the company's internal setup.

      Stefano Marinelli
      #81

      @OndrejZizka I never named the company 😉

  • Stefano Marinelli

        A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

        I then suspected a power failure, but the UPS should have sent an alert.

        The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

        To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

        The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

        That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing an internal alert to be triggered from the inside.

        The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

        The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

        Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

        Never rely only on internal monitoring. Never.

        #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring
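
        The external-check idea above can be sketched in a few lines: an off-site box probes each of the site's public IPs and treats *total* silence, not any single failure, as the site-wide alarm. This is a minimal illustration of the principle, not Uptime Kuma's actual code; the host labels are made up, and the `ping` flags assume a Linux-style `ping`.

        ```python
        # Sketch of an external reachability check: probe several of a site's
        # IPs from off-site and raise a site-wide alarm when ALL of them go
        # quiet -- total silence is itself the signal.
        import subprocess

        def is_reachable(host: str, timeout_s: int = 2) -> bool:
            """One ICMP probe. Assumes a Linux-style `ping` (-c count, -W timeout)."""
            result = subprocess.run(
                ["ping", "-c", "1", "-W", str(timeout_s), host],
                capture_output=True,
            )
            return result.returncode == 0

        def site_status(results: dict[str, bool]) -> str:
            """Classify one round of probe results for a single site."""
            up = sum(results.values())
            if up == len(results):
                return "ok"
            if up == 0:
                return "site-down"   # every IP silent at once: alert immediately
            return "degraded"        # partial failure: investigate
        ```

        In the story's scenario the ISP routers, the firewall, and the 4G backup all went dark together, which is exactly the all-probes-failed case no internal alert could ever report.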

        joseph
        #82

        @stefano And while you're not relying only on internal monitoring, make sure your external monitoring shares nothing with the monitored systems: a different ISP, a different cloud provider if it's in the cloud, no shared infrastructure at any level.


          Sharquaydius
          #83

          @stefano zapping the power lines, eh? Looks like the perfect solution to my nuisance neighbors with the big loudspeakers.


            Dan 🌻
            #84

            @stefano The true horror part of this story:

            > The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

            Home for the holidays, sick, serious family issue?? Who cares! You know what's more important?? Keeping that data center up and running!

            Glory to sacrificing yourself for the system!!

            Or maybe get someone else next time.

            • The Gaffer

              @stefano This immediately brought to mind coming into the office after a holiday weekend in 2005 and finding “my” computer room dark. I found our infrastructure manager, who told me that they had an unexpected power outage over the weekend. Confused, I said “But how is that possible? We have multiple feeds and a huge uninterruptible power supply!”

              I will never forget his response, delivered in his thick Scottish brogue: “Yes, we do. But it doesn’t do much good when the UPS catches fire.” 😳

              Asiny
              #85

              @thegaffer @stefano hahahaha


                javensbukan
                #86

                @thegaffer @stefano That reminds me of an incident at work. We have multiple power sources and generators, but none of that matters when the room with the UPS and power controller, where all the sources meet, floods from an overflowing toilet one floor above 🙃😅

                Whoopsie daisy!

                I had just finished bypassing all the network switches in the closets off that circuit when they managed to bypass it, and catastrophe was averted.

                That was a fun night! /s


                  miki
                  #87

                  @stefano AFAIK, professional alarm systems should work on the principle that "if it doesn't send periodic alerts saying that everything is OK, and there's no scheduled downtime, then something clearly isn't OK, and somebody needs to be sent to investigate ASAP."
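
                  That dead-man's-switch principle is easy to sketch: the off-site watcher alarms not on an explicit "I'm in trouble" message but on *missing* heartbeats. A minimal illustration; the interval and grace values are made up:

                  ```python
                  # Dead-man's-switch sketch: the site pushes periodic "all OK"
                  # heartbeats; the watcher alarms when they stop arriving.
                  HEARTBEAT_INTERVAL = 60.0   # site promises a heartbeat every minute
                  GRACE = 3                   # tolerate a couple of lost heartbeats

                  def heartbeat_overdue(last_seen: float, now: float,
                                        interval: float = HEARTBEAT_INTERVAL,
                                        grace: float = GRACE) -> bool:
                      """True once the silence exceeds the allowed window.

                      A fried alarm panel can't say "I'm in trouble", but it also
                      can't keep saying "everything is fine" -- so silence itself
                      is what raises the alarm.
                      """
                      return (now - last_seen) > interval * grace

                  # Watcher loop (sketch): poll the timestamp of the last heartbeat
                  # and page someone when heartbeat_overdue(last_seen, time.time()).
                  ```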


                    Stefano Marinelli
                    #88

                    @danvolchek To be honest, I offered to rush there myself, but he refused and decided to go (he wasn't far away).


                      Stefano Marinelli
                      #89

                      @javensbukan @thegaffer suuure...fun... 😆


                        Stefano Marinelli
                        #90

                        @miki I agree. In fact, their first step is to check why the monitoring company didn't call or intervene.


                          Sharquaydius
                          #91

                          @danvolchek @stefano One time I worked late and then the alarm wouldn't set. I called the security man and, on the phone, I heard a baby in the background.

                          This was after 8 pm, so I felt bad. I asked if that was the baby I could hear, and he confirmed. I told him I could chill until 10 pm. Half an hour later he called to say he was coming soon, so I could lock the door and go home.

                          This is what happens when we take responsibility. We tackle the messy reality. We make the everyday sacrifices.


                            r1w1s1
                            #92
                            Internal monitoring can go dark.
                            External monitoring tells the truth.

                            Great example of why both matter.
                            • pedernal
                              #93

                              @stefano @mkj @ricardo I guess that's an evolutionary competition... what's the typical voltage surge needed to catch the oldest zebras?


                                coldclimate
                                #94

                                @stefano tremendous story


                                  Ben Lubar (any pronouns)
                                  #95

                                  @stefano My stuff is hosted in my basement, and my "monitoring" is that someone will poke me in some kind of chatroom and ask "hey, is it broken?"

                                  And usually when that happens, it's been broken for hours or days already, because I design my software to handle my home internet connection or power going down as gracefully as possible.

                                  • Stefano MarinelliS Stefano Marinelli

                                    A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

                                    I then suspected a power failure, but the UPS should have sent an alert.

                                    The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

                                    To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

                                    The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

                                    That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be triggered from inside the site.

                                    The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

                                    The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

                                    Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

                                    Never rely only on internal monitoring. Never.
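For anyone wanting the gist of that external check in code: below is a minimal, hedged sketch of what an outside poller like Uptime Kuma does, not its actual implementation. The router IP (203.0.113.1, a documentation address), port 8291 (MikroTik's Winbox port, chosen here just as an example), and the three-failure threshold are all placeholder assumptions:

```python
import socket


def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def should_alert(history: list, threshold: int = 3) -> bool:
    """Fire only after `threshold` consecutive failures, to avoid alerting on one lost probe."""
    return len(history) >= threshold and not any(history[-threshold:])


def watch(host: str = "203.0.113.1", port: int = 8291, interval: int = 60) -> None:
    """Poll forever from an OFF-SITE machine; placeholder host/port above.

    Not invoked here -- call watch() yourself to start the loop.
    """
    import time
    history = []
    while True:
        history.append(is_reachable(host, port))
        if should_alert(history):
            # A real setup would hit a webhook / send SMS, not print.
            print("ALERT: site unreachable from outside -- page someone")
            history.clear()
        time.sleep(interval)
```

The whole point is where this runs: on a box outside the building, so a surge that fries everything on-site produces silence that the poller still sees.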

                                    #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring

                                    lorenzoL This user is from outside of this forum
                                    lorenzo
                                    wrote last edited by
                                    #96
                                    #DetectiveBarista
                                    • lorenzoL lorenzo
                                      #DetectiveBarista
                                      Andreas (82MHz)8 This user is from outside of this forum
                                      Andreas (82MHz)
                                      wrote last edited by
                                      #97

                                      @lorenzo @stefano
                                      I think Stefano, the mild-mannered barista of the BSD Cafe who posts pictures of sunsets and of his walks in nature, is just a cover, and in reality he is a tough-as-nails secret military agent who's chasing cybercriminals around the globe.
                                      See also his comment on my blog post about "just telling people to call the Barista" to make them crap their pants... this Barista has a secret! 🕵️

                                      • Stefano MarinelliS Stefano Marinelli


                                        James ScholesJ This user is from outside of this forum
                                        James Scholes
                                        wrote last edited by
                                        #98

                                        @stefano @andrew Well... that escalated quickly beyond where I was expecting it to go.

                                        • Stefano MarinelliS Stefano Marinelli


                                          xinquX This user is from outside of this forum
                                          xinqu
                                          wrote last edited by
                                          #99

                                          @stefano great story, thanks for sharing. Perhaps @mwl could write a novel, "Heroic Stories of a Tiny Router" or the like.
