Tag: thought experiment

On Secession

I’ve seen talk of secession since SCOTUS refused to hear Texas v Pennsylvania et al. I’ve also heard liberals wish them a fond farewell. Unlike the Civil War era, where the nascent industrialized North had a financial need for the agricultural production of the South … it’s a different financial picture today. Unfortunately, the net balance of payments to/from the federal government gets cited as a depiction of this economic reality. Deficit spending means *most* states get back more than they pay in — there were fewer than a dozen states with a negative balance of payments, and the federal outlay in Virginia alone exceeded the total excess intake from those states. But, yeah, I expect many liberal states would be economically viable. As would many conservative states. Liberal states that aren’t economically viable should be OK too — accepting redistribution of wealth is a tenet of liberalism. It’s the poorer conservative states that have a problem.
 
Of course the secessionists haven’t thought it through; they are throwing a tantrum. As a thought experiment, though, I tried to think through the creation of Trumptopia. I cannot conceptualize a fully functioning, unified nation. A common enemy is a great way to unite people. Get rid of that common enemy (so-called socialists), and I expect they’ll discover a lot of disagreement. More prosperous states won’t want to subsidize poorer states (a resentment I remember from German unification — glad to have the country reunited, but the economic hit for former West Germans really sucked). And, generalization aside, there are urban, liberal outposts like Memphis, Austin, etc. that won’t be keen on getting dragged along with the rest of the state. Unless secessionists are looking to go the route of Greek city-states, which brings its own set of challenges.
 
Even if Trumptopia managed to form somewhere, I expect Republicans have a libertarian/bear problem … letting industry self-regulate sounds good while we still have a lot of federal regulation, because we’re free to imagine industries as honest, pretend there’s enough competition for consumer choice to force acceptable behavior, and assume consumers are sufficiently well informed in their choices. Same with individual freedom — we’re making an a priori assumption that everyone else’s decisions will line up with our rationalizations (see: above bears).

Ransomware

My company held a ransomware response thought experiment recently – and, honestly, every ransomware response I’ve seen has been some iteration of “walk through backups until we find good files”. Maybe use something like SharePoint versioning to help identify a good target date (although that date may be different for different files … who knows!). But why wouldn’t you attempt a proactive identification of compromised files?

The basis of ransomware is that it encrypts data and you get the password after paying so-and-so a bitcoin or three. Considering that non-government virus authors (i.e. those who aren’t trying to slow down Iran’s centrifuges) are generally interested in creating mayhem, there’s not a lot of disincentive to creating that mayhem and making a couple of bucks along the way. I don’t anticipate ransomware becoming less prevalent in the future; in fact, I anticipate seeing it in vigilante hacking: EntityX gets their files back after they publicly donate 100k to their antithesis organisation.

Since it’s probably not going away, it seems worthwhile to immediately identify the malicious data scrambling. Reverting to yesterday’s backups sucks, but not as much as finding that your daily backups have aged out and you’re stuck with the monthly backup from 01 Nov as your last “good” data set. It would also be good to merge whatever your last good backup is with the current non-encrypted files, so the only ‘stuff’ that reverts is a worthless scramble of data anyway. Sure, someone may have worked on a file this morning, and it sucks for them to find their work back-rev’d to last night … but again, that’s better than everyone having to reproduce their last two and a half months of work.

Promptly identifying the attack: There are routine processes that read changed files — Windows Search indexing, antivirus scanning, SharePoint indexing. Scraping the Windows Search index log on every computer in the organisation is logistically challenging. Not impossible, but not ideal either. A central log for enterprise AV software or the SharePoint indexing log, however, can be parsed from the data centre. Scrape the log files for “unable to read this encrypted file” events. Then there are a myriad of actions that can be taken. Alert the file owner and have them confirm the file should be encrypted. Alert the IT staff when more than x encrypted files are identified per unit time. Check the create time-stamp and alert the file owner for any file that existed prior to being encountered as encrypted — a file that was born encrypted is more likely intentional.
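The log-scraping and more-than-x-per-unit-time alerting could look something like the sketch below. The log line format, the regex, and the threshold numbers are all hypothetical — a real AV or indexing log will have its own format — but the sliding-window logic is the point:

```python
import re
from collections import deque
from datetime import datetime, timedelta

# Hypothetical log line format -- substitute your AV/indexer's real format:
# 2020-12-15T09:14:02 WARN unable to read encrypted file: \\fileserver\share\report.xlsx
LINE_RE = re.compile(
    r"^(?P<ts>\S+)\s+WARN\s+unable to read encrypted file:\s+(?P<path>.+)$"
)

ALERT_THRESHOLD = 25            # "more than x files ..." -- tune to your environment
WINDOW = timedelta(minutes=10)  # "... per unit time"


def scan_log(lines):
    """Yield (timestamp, path) for every encrypted-file event in the log."""
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            yield datetime.fromisoformat(m.group("ts")), m.group("path")


def check_burst(events):
    """Return the burst of paths if more than ALERT_THRESHOLD events
    land inside one WINDOW; empty list means no alert."""
    recent = deque()
    for ts, path in events:
        recent.append((ts, path))
        # Drop events that have slid out of the time window.
        while recent and ts - recent[0][0] > WINDOW:
            recent.popleft()
        if len(recent) > ALERT_THRESHOLD:
            # Hand this list to the alerting (and, later, restore) tooling.
            return [p for _, p in recent]
    return []
```

A handful of flagged files an hour is probably users legitimately encrypting things; a few dozen in ten minutes is the pattern worth waking someone up for.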

Restoring only scrambled files: Since you have a list of encrypted files, you have a scope for the restore job. Instead of restoring everything in place (because who has 2x the storage space to restore to an alternate location?!), restore just the files recently identified as encrypted, either to an alternate location or in place. Ideally you’ve gotten user input on the encrypted files and can omit any the user indicated they encrypted on purpose.
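A minimal sketch of that scoped restore, assuming you already have (a) the list of paths flagged as encrypted and (b) the set of paths whose owners confirmed the encryption was intentional. The copy-from-backup-share approach here is illustrative — a real shop would drive its backup product’s own restore API instead:

```python
import shutil
from pathlib import Path


def build_restore_scope(flagged, user_confirmed_ok):
    """Everything flagged as encrypted, minus files the owner says are fine."""
    return sorted(set(flagged) - set(user_confirmed_ok))


def restore_in_place(scope, backup_root, live_root):
    """Copy each scoped file from the last good backup over the scrambled copy.

    Returns (restored, missing); 'missing' files are newer than the backup
    and need manual follow-up -- their owners lose that work regardless.
    """
    restored, missing = [], []
    for rel in scope:
        src = Path(backup_root) / rel
        dst = Path(live_root) / rel
        if src.exists():
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)  # preserves timestamps along with content
            restored.append(rel)
        else:
            missing.append(rel)
    return restored, missing
```

The key property is that the restore touches only the flagged paths — every file the ransomware never reached keeps this morning’s edits instead of reverting to last night’s backup.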