A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.

Category: Uncategorized
Tags: sysadmin · horrorstories · ithorrorstories · monitoring
176 Posts · 77 Posters
  • miki

    @stefano AFAIK, professional alarm systems should function based on the principle that "if it doesn't send periodic alerts saying that everything is ok, and there's no scheduled downtime, then something clearly isn't ok, and somebody needs to be sent to investigate it asap."

    Stefano Marinelli
    #90

    @miki I agree. In fact, their first order of business is to find out why they didn't call or intervene.
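
    As a minimal sketch of the dead-man's-switch principle miki describes, assuming the site touches a heartbeat file (or hits an endpoint) every few minutes and an off-site watcher pages someone when the "all ok" signal stops outside a planned maintenance window. The file path, interval, and alert hook are illustrative placeholders, not anything from the actual installation:

    ```python
    import time
    from datetime import datetime, timedelta
    from pathlib import Path

    # Illustrative values -- not from any real installation.
    HEARTBEAT_FILE = Path("/var/run/site-heartbeat")   # touched by the monitored site every cycle
    EXPECTED_INTERVAL = timedelta(minutes=5)           # how often "everything is ok" should arrive
    GRACE = timedelta(minutes=2)                       # tolerance for jitter

    def in_scheduled_downtime(now: datetime) -> bool:
        # Placeholder: consult a maintenance calendar here.
        return False

    def alert(message: str) -> None:
        # Placeholder: page an on-call human (SMS, phone call, push notification).
        print(f"ALERT: {message}")

    def check_once() -> None:
        now = datetime.now()
        if in_scheduled_downtime(now):
            return
        try:
            last_beat = datetime.fromtimestamp(HEARTBEAT_FILE.stat().st_mtime)
        except FileNotFoundError:
            alert("never received a heartbeat -- someone needs to look at the site")
            return
        if now - last_beat > EXPECTED_INTERVAL + GRACE:
            alert(f"no 'all ok' signal since {last_beat:%Y-%m-%d %H:%M} -- assume something is wrong")

    if __name__ == "__main__":
        while True:
            check_once()
            time.sleep(60)
    ```

    The key property is that silence itself is treated as the alarm condition, which is exactly what the surge attack in this story was designed to defeat on the inside.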

  • Dan 🌻

      @stefano The true horror part of this story:

      > The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

      Home for the holidays, sick, serious family issue?? Who cares! You know what's more important?? Keeping that data center up and running!

      Glory to sacrificing yourself for the system!!

      Or maybe get someone else next time.

      Sharquaydius
      #91

      @danvolchek @stefano One time I worked late and then the alarm would not set. I called the security man and, on the phone, I heard a baby in the background.

      This was after 8pm, so I felt bad. I asked if that was the baby I could hear and he confirmed. I told him I could hang on until 10pm. Half an hour later he called to say he was on his way, so I could lock the door and go home.

      This is what happens when we take responsibility. We tackle the messy reality. We make the everyday sacrifices.

  • Stefano Marinelli

        A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.

        I then suspected a power failure, but the UPS should have sent an alert.

        The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.

        To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter thick walls. They were targeted by a professional gang. They used a tactic seen in similar hits: they identify the main power line, tamper with it at night, and send a massive voltage spike through it.

        The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.

        That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing an internal alert to be triggered from the inside.

        The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.

        The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.

        Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.

        Never rely only on internal monitoring. Never.

        #IT #SysAdmin #HorrorStories #ITHorrorStories #Monitoring

        r1w1s1
        #92
        Internal monitoring can go dark.
        External monitoring tells the truth.

        Great example of why both matter.
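
        As a rough sketch of what such an external check boils down to (the real setup in the story used Uptime Kuma pointed at the client's MikroTik router; the IPs and webhook below are placeholders), run from a machine outside the site, e.g. from cron every minute:

        ```python
        import subprocess
        import urllib.request
        import json

        # Placeholder targets and webhook -- not the real site's addresses.
        TARGETS = ["203.0.113.1", "203.0.113.2"]        # e.g. router WAN IP, firewall, 4G backup
        ALERT_WEBHOOK = "https://example.invalid/hooks/oncall"

        def reachable(ip: str) -> bool:
            """Single ICMP probe using the system ping (Linux-style flags)."""
            result = subprocess.run(
                ["ping", "-c", "1", "-W", "3", ip],
                stdout=subprocess.DEVNULL,
                stderr=subprocess.DEVNULL,
            )
            return result.returncode == 0

        def notify(message: str) -> None:
            """Fire a JSON payload at an alerting webhook (placeholder URL)."""
            data = json.dumps({"text": message}).encode()
            req = urllib.request.Request(
                ALERT_WEBHOOK, data=data, headers={"Content-Type": "application/json"}
            )
            urllib.request.urlopen(req, timeout=10)

        def main() -> None:
            down = [ip for ip in TARGETS if not reachable(ip)]
            if len(down) == len(TARGETS):
                # Everything going silent at once is the interesting signal: power, uplink, or worse.
                notify(f"site unreachable from outside: no reply from {', '.join(down)}")
            elif down:
                notify(f"partial outage: no reply from {', '.join(down)}")

        if __name__ == "__main__":
            main()
        ```

        The point mirrors the thread: the probe lives outside the building, so a fried UPS or cut uplink cannot stop the alert from firing.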
          • pedernal
            #93

          @stefano @mkj @ricardo I guess that's an evolutionary competition... What is the typical voltage surge to catch the oldest zebras?

          • Stefano Marinelli [quoting the original post above]

            coldclimate
            #94

            @stefano tremendous story

            • Stefano Marinelli [quoting the original post above]

              Ben Lubar (any pronouns)
              #95

              @stefano My stuff is hosted in my basement and my "monitoring" is that someone will poke me in some kind of chatroom and say "hey, is it broken?"

              And usually when that happens, it's been broken for hours or days already, because I design my software to handle my home internet connection or power going down as gracefully as possible.

              • Stefano Marinelli [quoting the original post above]

                lorenzo
                #96
                #DetectiveBarista
                • lorenzo [quoting the post above]
                  Andreas (82MHz)
                  #97

                  @lorenzo @stefano
                  I think Stefano, the mild-mannered barista of the BSD Cafe who posts pictures of sunsets and of his walks in nature, is just a cover, and in reality he is a tough-as-nails secret military agent chasing cybercriminals around the globe.
                  See also his comment on my blog post about "just telling people to call the Barista" to make them crap their pants... this Barista has a secret! 🕵️

                  • Stefano Marinelli [quoting the original post above]

                    James Scholes
                    #98

                    @stefano @andrew Well... that escalated quickly beyond where I was expecting it to go.

                    • Stefano Marinelli [quoting the original post above]

                      xinqu
                      #99

                      @stefano Great story, thanks for sharing. Perhaps @mwl could write a novel, "Heroic Stories of a Tiny Router" or something like that.

                      • Andreas (82MHz) [quoting the post above]

                        Stefano Marinelli
                        #100

                        @82mhz @lorenzo A real photo of Stefano, called "The Barista"

                        • Stefano Marinelli [quoting the post above]

                          lorenzo
                          #101
                          💙

                          CC: @82mhz@bsd.cafe
                          • Stefano Marinelli [quoting the original post above]

                            Uriel Fanelli
                            #102
                            In the first sentence you mention a "data center", but such an attack would not work against a data center: to qualify as one, you need two buildings with independent power supplies, at a safe distance from each other, and so on. I think this was at best a hosting room, not a data center.
                            • Uriel Fanelli [quoting the post above]
                              Stefano Marinelli
                              #103

                              @uriel Sure - we tend to call a "data center" the specific place inside the company that hosts the servers (with A/C, etc.). Maybe that's a little inappropriate here.

                              • Stefano Marinelli [quoting the original post above]

                                Tionisla
                                #104

                                @stefano Wow, cool story and well done! 👍

                                And yes, sometimes the truth really is better than fiction (thinking about something a while back that I was part of in my job which could easily have been from a badly scripted reality TV show. Can't go into details because of an NDA 🙈)

                                • Tionisla [quoting the post above]

                                  Stefano Marinelli
                                  #105

                                  @Tionisla Thank you. Yes, this is true. Sometimes things IRL are stranger than fiction. And, looking back, I've lived through some incredible experiences. If I had told my 20-year-old self, he would never have believed it.

                                  • Stefano Marinelli [quoting the post above]

                                    Tionisla
                                    #106

                                    @stefano Heh, yeah, and even now you have to sit down, rub your eyes, and go "wtf". 😄

                                    • Stefano Marinelli [quoting the original post above]

                                      Dianora (Diane Bruce)
                                      #107

                                      @stefano I must repeat this: never trust on-site backups either. Fire will destroy those. And RAID is not a backup.
                                      You know this, but it bears repeating!

                                      • Stefano Marinelli [quoting his earlier reply to @uriel above]

                                        Uriel Fanelli
                                        #108
                                        Well, not "a little". The one you described is - at best - a server room, not even a hosting center, since according with the blueprints, there was no redundancy....
                                        • Dianora (Diane Bruce) [quoting the post above]

                                          Stefano Marinelli
                                          #109

                                          @Dianora absolutely! No local backup is a safe backup.
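
                                          In the same spirit, a bare-bones sketch of a scheduled off-site sync driving rsync over SSH, along the lines of the emergency off-site copy described in the original post. The paths, remote host, and log file are placeholders, not the client's actual setup; in practice a versioned, encrypted tool such as restic or borg is a better fit than plain rsync, and the job itself should be monitored externally like everything else:

                                          ```python
                                          import subprocess
                                          from datetime import datetime

                                          # Placeholder paths and destination -- illustrative only.
                                          SOURCES = ["/srv/vm-backups", "/srv/fileserver"]
                                          REMOTE = "backup@offsite.example.net:/backups/client"   # off-site host reached over the second uplink
                                          LOGFILE = "/var/log/offsite-sync.log"

                                          def sync(source: str) -> int:
                                              """Push one tree to the off-site host with rsync over SSH."""
                                              cmd = [
                                                  "rsync", "-a", "--delete", "--partial",
                                                  "-e", "ssh -o ConnectTimeout=15",
                                                  source, REMOTE,
                                              ]
                                              return subprocess.run(cmd).returncode

                                          def main() -> None:
                                              failures = []
                                              for src in SOURCES:
                                                  if sync(src) != 0:
                                                      failures.append(src)
                                              with open(LOGFILE, "a") as log:
                                                  status = "FAILED: " + ", ".join(failures) if failures else "ok"
                                                  log.write(f"{datetime.now():%Y-%m-%d %H:%M} offsite sync {status}\n")

                                          if __name__ == "__main__":
                                              main()
                                          ```

                                          Run it from cron or a systemd timer; a copy that only exists inside the same building fails together with everything else in it.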
