A few days ago, a client’s data center (well, actually a server room) "vanished" overnight.
-
Mostly blueprints from companies like Cisco, IBM, Google, and so on.
@uriel Ah, so they have their own internal guidelines as to what constitutes a datacenter, but there isn't a central definition from some kind of standards body that has coined the term, with requirements for separate buildings, independent power supplies, safe distances, etc.?
-
@Dianora @EnigmaRotor @mwl I could never compete with the Mentor. He's more of a spiritual guide
@Dianora @EnigmaRotor @stefano
Spiritual guide? Your spirits are gonna go somewhere pretty dang weird, sir.
-
@mwl @EnigmaRotor @stefano Was that an offer to buy us all a round of beers at BSDCan? *whistles innocently*
-
Let's put it that way. Some people build datacentres; some other people don't. The guidelines of the people who do are a little more interesting than the guidelines of the people who don't. It's a crazy world, man.
-
A few days ago, a client’s data center (well, actually a server room) "vanished" overnight. My monitoring showed that all devices were unreachable. Not even the ISP routers responded, so I assumed a sudden connectivity drop. The strange part? Not even via 4G.
I then suspected a power failure, but the UPS should have sent an alert.
The office was closed for the holidays, but I contacted the IT manager anyway. He was home sick with a serious family issue, but he got moving.
To make a long story short: the company deals in gold and precious metals. They have an underground bunker with two-meter-thick walls. They were targeted by a professional gang, who used a tactic seen in similar hits: identify the main power line, tamper with it at night, and send a massive voltage spike through it.
The goal is to fry all alarm and surveillance systems. Even if battery-backed, they rarely survive a surge like that. Thieves count on the fact that during holidays, owners are away and fried systems can't send alerts. Monitoring companies often have reduced staff and might not notice the "silence" immediately.
That is exactly what happened here. But there is a "but": they didn't account for my Uptime Kuma instance monitoring their MikroTik router, a setup installed just weeks ago. Since it is an external check, it flagged the lack of response from all IPs without needing any alert to be triggered from the inside.
The team rushed to the site and found the mess. Luckily, they found an emergency electrical crew to bypass the damage and restore the cameras and alarms. They swapped the fried server UPS with a spare and everything came back up.
The police warned that the chances of the crew returning the next night to "finish" the job were high, though seeing the systems back online would likely make them move on. They also warned that thieves sometimes break in just to destroy servers to wipe any video evidence.
Nothing happened in the end. But in the meantime, I had to sync all their data off-site (thankfully they have dual 1Gbps FTTH), set up an emergency cluster, and ensure everything was redundant.
Never rely only on internal monitoring. Never.
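For anyone curious what the "external check" boils down to, here is a minimal sketch in Python. It is not the actual deployment - the real setup used Uptime Kuma's built-in monitors, not a hand-rolled script - and the IPs (TEST-NET addresses) and webhook URL are placeholders. The point is only that the probe runs from a host outside the site, so nothing on the inside has to survive for the alert to fire.

#!/usr/bin/env python3
# Minimal external reachability check - a sketch of the idea behind an
# Uptime Kuma monitor. Run from a host OUTSIDE the monitored site.
# The IPs and webhook below are placeholders, not the real deployment.
import socket
import urllib.request

# Hypothetical public endpoints of the site (e.g. the two FTTH routers).
TARGETS = [("203.0.113.10", 443), ("203.0.113.11", 443)]
WEBHOOK = "https://alerts.example.com/notify"  # placeholder endpoint

def reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if not any(reachable(h, p) for h, p in TARGETS):
    # Every path into the site is down: exactly the "silence" that a
    # fried, battery-backed internal alarm can never report itself.
    urllib.request.urlopen(WEBHOOK, data=b"site unreachable on all IPs")

Run from cron on a machine in another network, something like this needs no agent, VPN, or working electronics on-site.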
@stefano hi, why haven't you posted this as a proper blog post?
-
@stefano wow! That's a story!
-
@uriel Sure is

-
@Dianora @EnigmaRotor @stefano
beer is too normal...
-
@stefano I wasn't aware of this kind of problem with internal monitoring, or of the importance of external monitoring. However, I think it's even more important to monitor the monitoring server itself, or to have a heartbeat from the monitoring system (external or internal), because the external monitoring system could also fail without anyone noticing.
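That heartbeat is usually built as a dead man's switch: the monitor reports "still alive" to a second, independent service on every cycle, and that service raises the alarm when the reports stop. A minimal sketch, assuming a placeholder heartbeat URL (Uptime Kuma's "push" monitors and services like Healthchecks.io follow this pattern):

#!/usr/bin/env python3
# Dead man's switch sketch: the monitoring host reports "still alive"
# on every cycle to a second, independent watcher, which alerts when
# the reports stop arriving. The URL below is a placeholder.
import time
import urllib.request

HEARTBEAT_URL = "https://hc-ping.example.com/monitor-alive"  # placeholder

while True:
    # ... run the regular monitoring checks here ...
    try:
        urllib.request.urlopen(HEARTBEAT_URL, timeout=10)
    except OSError:
        pass  # the watcher notices the missing ping and raises the alarm
    time.sleep(60)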
@zako Cute joke.
-
@lorenzo @stefano
I think Stefano, the mild-mannered barista of the BSD Cafe who posts pictures of sunsets and of his walks in nature, is just a cover, and in reality he is a tough-as-nails secret military agent chasing cybercriminals around the globe.
See also his comment to my blog post about "just telling people to call the Barista" to make them crap their pants... this Barista has a secret!
-
@gumnos @mwl @Dianora @stefano Both may, for sure, be present at the table. #devicedrivers
-
@EnigmaRotor @gumnos @mwl @stefano Only if the plugboard is also set up right.
-
@stefano Have to integrate this story into the pitch for our monitoring service

-
@stefano Good for you. If, next time, you could solve your problems without involving people who are sick at home with a serious family issue on top of it, that would be great.
-
@fennek Calling these 'my' problems is inaccurate; I am simply providing services to this company and I have no formal contract or obligation regarding this specific issue. I could have easily ignored the alert, especially since I wasn't aware the person in charge was out sick. Despite this, I offered to step in and handle it myself - even though it’s hours away - to help out and allow them to stay home.
-
@EnigmaRotor @gumnos @mwl @Dianora is this board powered by a BSD?
-
@tkr I will - but it's too fresh and still not totally over. Once I have all the final details, this will become a blog post.