FINALLY! An Actual Use for QR Codes!!!

February 17, 2020 11:04pm

There’s no sense in holding back now – I’ve honestly always thought that QR codes were kind of dumb.

BUT…

I just stumbled across a pretty cool idea for them at home that I wanted to share!

You see, I just bought a new wifi access point, so I’m doing some spring cleaning on my home network and I ended up renaming the SSID that my guest network uses.

This is notable because I’m one of those weird people who believes in long, simple passwords instead of complicated strings that are hard to remember, so I’ve grown accustomed to many an eye-roll when people ask what my wifi password is at home … oftentimes resulting in guests just handing over their devices and asking me to type it in for them…

But no more!

So I found this blog post talking about an office posting a QR code that visitors could use to easily log in to their guest wifi, and I figured, why not try that at my house, too?!

I took it a step further by using this free website that lets you generate your own QR code in about 30 seconds…

And the end result, after a couple of quick tests to watch the magic work, was this simple Word doc that I could print out and stick to the fridge for easy access anytime someone visits…
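For the curious, there’s nothing magical encoded in one of these things, either – it’s just a standard “WIFI:” text string that phone cameras know how to parse. If you’d rather skip the website entirely, here’s a rough sketch of generating the same image yourself with the Python qrcode package (the SSID and password below are placeholders, obviously, not my real ones):

```python
import qrcode  # pip install "qrcode[pil]"

# Build the standard "WIFI:" payload that phone cameras recognize.
# Placeholders only -- swap in your own SSID, auth type, and (long!) password.
# Note: semicolons, colons, commas, and backslashes in either value need a backslash escape.
ssid = "MyGuestNetwork"
password = "a very long but easy to remember passphrase"
payload = f"WIFI:T:WPA;S:{ssid};P:{password};;"

# Render it and save a PNG that's ready to drop into a Word doc and print.
qrcode.make(payload).save("guest-wifi.png")
```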

For the most part, I still think that these things are dumb – maybe because they just never got the penetration here in the States to justify companies pasting them all over everything instead of using their URLs – but if it saves me from being tasked with typing my 46-character wifi password into the phones of every friend and family member who comes to my house, then I will take one for the team and admit … this particular use case is kind of neat. 😉

Evolution of a Media Collection

November 23, 2019 3:00pm

I wish I had kept better records of this over time – I’m basically just going by when I added new drives, but it’s still crazy to see how this collection has grown over only five years’ time…

Consider this – when I first got interested in collecting media back in my early twenties, I started with three 80 GB hard drives … one fully dedicated to music, another to animated TV (mostly The Simpsons, Futurama, and Duckman), and the third to live TV. These disks were filled first by a 56k modem that incessantly redialed all night long, and later by a 1 Mbps cable modem.

Now here in 2019, I just finished building out a 106 TB NAS, with a 500 Mbps fiber line to fill it.

It kind of makes me wonder just how long the 35 TB remaining on my new NAS are going to last me, especially when the data shows that I tend to go through a spike in downloads whenever I have new disk space available to me. 😉
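Just for fun, here’s the napkin math on how fast that could happen – a rough sketch that assumes the fiber line runs flat out around the clock, which of course it never actually does:

```python
# How quickly could the remaining 35 TB disappear over a 500 Mbps line?
# (Pure napkin math -- real downloads never run flat out 24/7.)
free_tb = 35
line_mbps = 500

bytes_per_second = line_mbps * 1_000_000 / 8                      # ~62.5 MB/s
seconds_to_fill = free_tb * 1_000_000_000_000 / bytes_per_second
print(f"~{seconds_to_fill / 86_400:.1f} days of nonstop downloading")  # roughly 6.5 days
```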

To Migrate 70 Terabytes…

November 23, 2019 2:44pm

It’s true.

The project that I started almost two months ago to migrate all of my home server data from rapidly aging desktop hardware to a rackmount NAS is finally completed.

What was previously around 60 TB spread across 9 hard drives of varying shapes, sizes, and ages has now been moved to a dozen 10-12 TB drives all born within the last year, including two parity drives for redundancy. It all lives in a new-to-me server that will be dedicated to nothing but storing files, finally separating Plex and the various apps that I use to download media onto their own hardware, where disk conflicts should officially be a thing of the past!

Of course, it didn’t take that full two months solely to move the data from one set of drives to the other … even though at times it certainly felt like it…

A good chunk of time was spent waiting for Unraid to clear and format new drives – a little over a day for each 12 TB drive. 😯

I also had to limit when I could migrate data so as to not impact Plex, considering both that copying at full tilt ate up a lot of CPU on my old server AND that copying at full tilt into the new server made it difficult to stream media from the affected drive at the same time.

I ended up counteracting the latter by adding a 1 TB SSD cache drive to Unraid, which unfortunately limited me to moving about 1 TB at a time because the mover process that moves data from the cache to the array (normally at night) is equally intensive.

That said, most of the speeds I got from the old server weren’t fast enough for that to matter anyway. For drives attached directly to the motherboard, I could average speeds of 60 MBps; however, a good chunk of my media was living on external USB drives, which meant my transfers were more likely to crawl along at 30 MBps instead…

Comical when the SSD cache can take writes at upwards of 90-100 MBps, with even higher read speeds, but hey – I knew that speed wasn’t one of the selling points of going with Unraid, anyways.
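For a sense of why the copying alone took so long, here’s the back-of-the-envelope math on moving ~60 TB at the speeds I was actually seeing – a rough sketch, not a benchmark:

```python
# Napkin math: how long does ~60 TB take at various sustained transfer speeds?
def days_to_copy(terabytes, mb_per_second):
    return terabytes * 1_000_000 / mb_per_second / 86_400

for label, speed in [("drives on the motherboard (~60 MBps)", 60),
                     ("external USB drives (~30 MBps)", 30),
                     ("SSD cache at full tilt (~100 MBps)", 100)]:
    print(f"{label}: about {days_to_copy(60, speed):.0f} days of nonstop copying")
```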

Those two months also included a disk recovery … truth be told, I actually lost two disks, which is what prompted me to finally put all of this into motion! One was a lost cause, so I just made a list and re-downloaded everything over time. The second I left alone until everything else was done, and then I was able to recover it using this great free app that I found called testdisk. It turns out the partition table had gotten corrupted somehow – a problem that had actually hit me once before and that I’d managed to repair at the time. This time, once I realized that testdisk would let me copy the contents over to another disk that I now had to spare, I opted to just do that instead, and about 12 hours later roughly 490 movies were sitting on a fresh disk, ready to migrate over to the new array!

So anyways, as of yesterday everything is now living on the new server and I’m basically ready to power down old faithful and prepare it for its afterlife. Once I get the cables I’m waiting on to move the new NAS into my rack in the closet, I’m going to bring that old server down to my office and disassemble it, give it a good cleaning, and actually remove the dead drives that are still installed. Then I’m going to wipe the thing and turn it into a sandbox of sorts for a few random things that don’t really have a place on my other servers…

  • Plex media local backup – Until I can build out a proper backup NAS, I’m going to take a couple of leftover 8 TB drives and back up the most essential 16 TB of media in my collection as an additional backup on top of the 1 TB that I’m now backing up to the cloud.
  • Torrent seeding – I found this great Docker container for Transmission that incorporates OpenVPN for a seamless experience, which makes me feel a little more comfortable having it running full time to help seed some of the more difficult-to-find files that I had to hunt for after not being able to get them from usenet.
  • DVD ripping station – Right now the only computers left in this house that still have optical drives are one rackmount server and a very old laptop that I first trialed Plex on before moving it to my desktop hardware. I actually bought an internal Blu-ray drive shortly after I started getting into Plex because I thought I’d end up ripping all of my media instead of just downloading it, so it’s been buried underneath my desk for about four years now. Nonetheless, I want to install it in this repurposed machine to have something a little more accessible for the random DVDs that I have to rip myself.

Not sure how much else will end up there simply because the CPU inside is pretty weak at this point, but I’m really trying to keep my main environments a little cleaner and not just install any old random thing that I come across, so this will be a good place for that because it won’t really matter if there are disruptions.

Looking forward to writing up a separate post outlining all of the reasons why I love Unraid now that I’ve been using it for a couple of months, and of course, I’m already working on expansion plans to move beyond the 106 TB that’s currently installed in my already very full new NAS! 😉

Rediscovering Music, Digitally

September 24, 2019 2:48pm

Recently I decided to revisit the seemingly gargantuan task of reorganizing my digital music collection.

It’s something that I’ve been putting off even longer than updating my backup plan because I honestly don’t listen to music much except for maybe when I’m in the car, even though it seems silly to only have access to about 20 albums on my phone when I’ve got upwards of 100 GB of music sitting on the server at home.

But really, therein lies the problem – I’ve found that while Plex has been my loving savior for roughly 99.5% of my digital media woes, the one area where it seems to fall short is in organizing my music because of how it identifies songs … or at least tries to, anyways.

It turns out that despite going through the steps many, many moons ago to convert all of my mountains of CDs that I acquired through the likes of BMG and Columbia House to MP3s, apparently the tags that got embedded in the files are inconsistent as all crap. It never really bothered me because I had the files themselves organized by genre, artist, and album, and I’d play everything with Windows Media Player (or WinAmp if we really want to whip the llama’s ass…).

#geeknostalgia

Anyways, it turns out that when you tell Plex to use a file’s tags, it takes that directive very seriously, even when to my regular, human eyes some of them are absolute garbage! Completely ignoring directory structure, it would mix tracks among different albums and sometimes even classify music under several different artists if their names were spelled incorrectly across the various tracks!

It sucks, which is why I’ve put the project off for so long. Lately, though, I’ve been thinking a lot about the data I hoard and how it makes all of the sense in the world to store it in a format that’s actually useful for consumption, so it was time to finally start addressing the problem…

…which in my case means importing one or two artists at a time and refreshing my Plex library, then reviewing the results and making any manual changes to group songs together correctly, list multi-disc sets correctly, and so forth.
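Honestly, half the battle is just figuring out which files have garbage tags in the first place. As an illustration of the kind of mess I mean, here’s a rough sketch using the Python mutagen package (assuming it’s installed) to dump the artist/album tags across a folder before Plex ever sees them – the path is just an example, not my actual layout:

```python
from pathlib import Path
from mutagen.easyid3 import EasyID3  # pip install mutagen

# Walk a music folder and print the artist/album tags each MP3 actually carries,
# which makes misspelled artists and mismatched albums easy to spot before importing.
music_root = Path("~/Music/incoming").expanduser()  # example path only

for mp3 in sorted(music_root.rglob("*.mp3")):
    try:
        tags = EasyID3(str(mp3))
    except Exception:
        print(f"NO TAGS   {mp3}")
        continue
    artist = tags.get("artist", ["<missing>"])[0]
    album = tags.get("album", ["<missing>"])[0]
    print(f"{artist:<30} {album:<30} {mp3.name}")
```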

After several hours of work, Plex tells me that so far I’ve added a whopping 19 artists to my library, so I’ve clearly got a long ways to go, but the plus side of all of this is that I’ve been stumbling back across all of these great songs that I used to love at various times over the years.

So I thought it might be fun to share a few here – most are from my college days, though Led Zeppelin I listened to pretty much religiously back in high school! It’s amazing how beautiful some of those guitar tracks are that I’ve completely forgotten about…

Maybe as I come across others, I’ll write a little something about select favorites and what they mean to me … seeing as that was actually the original intention of this blog post when I first started writing. 😉

A Change of Backup…

September 23, 2019 3:51pm

This past weekend I finally made the decision to switch backup providers from CrashPlan to Backblaze.

I think.

It seems like I’ve been using CrashPlan forever at this point – at least 5 years now – and it’s honestly something that I just set up a long time ago and left alone … like you’re supposed to with any good backup! 😉

The problem – and it’s one that I’ve admittedly been ignoring for a while now – is that over the last couple of years CrashPlan’s price has crept up while its feature set has crept down, so I honestly haven’t been getting the value out of it that I once was oh so long ago…

I believe the cost was $5/month/computer when I first started using CrashPlan, and I used it for both my laptop and critical files on my home server (which was cool because they had a Linux client that was really easy to use!). Then a few years later, they unexpectedly dropped home support, which was going to double the price in the long term … though in their defense, they offered a 50% discount off home pricing for one year to ease the transition.

So basically my pricing went from $10/month for my two computers, down to $5/month during the discount year, and then up to $20/month under the new plan – all over a few years’ time!

The bigger hit was that this summer they added a special exclusion for Plex files, which was a big part of what I backed up off my server. I didn’t try to send them my entire library of dozens of TB, mind you, but it seemed reasonable to send them 20 GB of config and metadata so that I could restore Plex easily if the server bit the big one.

In total, I had something like 400 GB backed up with CrashPlan – roughly 200 GB of personal photos and writing and everything else from my laptop, and another 200 GB of Plex config data and some music and other hard to replace archived stuff from my server.

So anyways…

It’s been eating at me for a while that I needed to make a change.

I’ve actually followed Backblaze for a long time because I love how open they are about how they store massive amounts of data. I guess I always just wrote them off because I didn’t want to go with yet another $5/month plan, their unlimited plan doesn’t support Linux anyways, and I assumed that their usage-based B2 plan would be too expensive for my needs.

The funny thing is, apparently when you’re already spending $20/month on backups, that’s enough to store about 4 TB of data using Backblaze’s B2 system!

I think part of the problem has been that whenever I looked at their pricing in the past, I always equated it to backing up my entire data collection – including what’s now 60+ TB of TV shows and movies for Plex – which in turn ends up being something like $300/month and is completely unreasonable for a simple backup strategy!

Yet after having endured a couple of hard drive failures across my collection, I’m starting to realize that there are certainly subsets of my data that are easier to replace than others. And so instead of B2 being this out-of-reach backup strategy for all of my data, it suddenly became a new opportunity to go from 400 GB backed up with CrashPlan to nearly 4 TB backed up with Backblaze for about the same monthly cost.

😯
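Here’s the back-of-the-envelope math that sold me, assuming B2’s storage rate at the time of roughly half a cent per GB per month (and ignoring download and transaction fees, which are extra):

```python
# B2 storage math at roughly $0.005/GB/month (the 2019-ish rate; egress not included).
RATE_PER_GB_MONTH = 0.005

for label, gigabytes in [("old CrashPlan footprint (400 GB)", 400),
                         ("new backup budget (4 TB)", 4_000),
                         ("the entire Plex library (60 TB)", 60_000)]:
    print(f"{label}: about ${gigabytes * RATE_PER_GB_MONTH:,.2f}/month")
```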

Maybe I’ll do a separate post that’s a little more technical when I finally pull the plug … CrashPlan renews again on 10/10, so I’ve got a couple of weeks to test the waters to make sure I’m truly happy with Backblaze before I cancel one account and fully commit to the other. But so far, I’m pretty satisfied.

I found this free, open source software called Duplicati to manage the backups themselves, and it was super easy to install on both macOS and CentOS. Within about 36 hours’ time this past weekend, I had 220 GB from three separate machines backed up to B2, which according to their calculator will run me about $1.10/month, so that’s cool! 🙂

I still need to do some testing on restores to see how that works, but it seems fairly straightforward via Duplicati.

I think in all of my years of using CrashPlan, I had to do one restore and it was 100 GB of music when a drive failed in my server. Their client made it just about seamless, so here’s to hoping for a similar experience with the new guard as well…

So to any sysadmins who do this kind of stuff on a daily basis, this is going to seem way obvious, but for somebody who doesn’t and has been struggling with this literally for months … let’s just say I’m pretty happy to finally have figured this out!

Also, this post is mostly for documentation’s sake so that I have a place to look back to when I need to do it again sometime many moons into the future…

It’s hard to believe that it’s been over a year already since I migrated my Plex server off of my old desktop hardware over to a proper rackmount server. Or at least Plex itself migrated, while the bevy of hard drives that 50+ TB of media lives on still resides in that aged and ever-waning PC.

Anyways, last June when I made the big leap to server-grade hardware, I only had a single hard drive to run VMs from for the new machine. For simplicity’s sake, I set it up as a RAID 0, single disk array, with the understanding that I could “easily” add more disks a few months later and re-configure that array into a more resilient RAID 5.

In fact, according to Amazon I did buy two more drives to use for said purpose in September 2018.

And just yesterday I finally got them working!

You see, it was probably too easy for me to set up that initial RAID 0 array via the new server’s BIOS. At the time, it seemed simple enough to add more drives to the pool and then reconfigure the array itself.

But one thing I’ve learned somewhat painfully since I first set this server up is that everything is more picky than that. Versions have to line up with the hardware, and older versions lack features supported by newer versions, even while they’re all being supported by the companies in parallel. This isn’t really news to me, but it’s certainly something that I never had to scrutinize to this extent.

With my old desktop server…

  1. Connect new hard drive.
  2. Find it in the CentOS Disks GUI, quick format it, and mount it.

With my new server…

  1. Connect new hard drive.
  2. Try to add it to my RAID pool via the RAID controller, but you can’t.
  3. Try to add it via ESXi, but you can’t.
  4. Try to connect via Dell OpenManage, but I hadn’t installed the server-side software in ESXi correctly – Dell’s support page for this server only goes up to ESXi 6.0 even though I’m running 6.5 – until I finally found the right software in a support doc via Google.
  5. Try to connect via Dell OpenManage, but they only make a Windows client, so I have to find a laptop to do that.
  6. Try to connect via Dell OpenManage, but the server doesn’t have a valid certificate and the login failure doesn’t mention that this is the problem, so I just guess until I notice a checkbox for ignoring the certificate, and finally it works!
  7. Add new disks to RAID pool and reconfigure from RAID 0 to RAID 5 … and wait a very long time.
  8. Worry the entire time because I hadn’t made backups of my VMs, since I never could figure out exactly how to do it.
  9. Try to expand virtual disk via ESXi now that the extra space is available, but it still doesn’t see it.
  10. Confirm via Dell OpenManage that the reconfigure is definitely done now and showing the extra space as available.
  11. Wait until 1:30am when nobody is using Plex and just reboot the whole thing, just in case.
  12. Try to expand virtual disk via ESXi, and now it sees it!
  13. Allocate the additional space to the VM and reboot it, but that does nothing.
  14. Spend an hour Googling for instructions on how to allocate the new space inside the guest OS until I finally find a random support post that ends up working not unlike magic (something along the lines of the sketch after this list)!
  15. Verify that the new disk space is finally ready to use in the VM, and then debate whether it’s going to be enough or if I should’ve bought yet another disk just in case…
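For my own future reference, the gist of what that support post walked me through looks something like the sketch below. Fair warning that this is my reconstruction, not the post itself, and it assumes a stock CentOS VM with LVM on /dev/sda2 and an XFS root volume – the device, volume group, and mount point will differ on other setups:

```python
import subprocess

# Rough sketch of growing a CentOS guest's root filesystem after the virtual disk
# has been expanded in ESXi. Device/volume names (sda, sda2, centos/root) are
# assumptions for a stock LVM-on-XFS install -- adjust for your own layout.
def run(cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Ask the kernel to notice that the virtual disk got bigger.
run(["bash", "-c", "echo 1 > /sys/class/block/sda/device/rescan"])
# 2. Grow the partition into the new space (growpart comes from cloud-utils-growpart).
run(["growpart", "/dev/sda", "2"])
# 3. Let LVM see the larger physical volume, then extend the root logical volume.
run(["pvresize", "/dev/sda2"])
run(["lvextend", "-l", "+100%FREE", "/dev/centos/root"])
# 4. Grow the XFS filesystem to fill the newly extended volume.
run(["xfs_growfs", "/"])
```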

I mean, looking back logically it does make sense – first add the physical drives, then add them to the RAID pool, then rebuild the RAID array, then add new space to the Virtual Disk, then allocate the new space to a specific VM, then update the VM to recognize its new resources … maybe I was just hoping it would be slightly more seamless, even if only in parts! 😛

If anything, I guess it should be a tad easier the next time around, and now that I’ve gotten the bugs worked out of OpenManage, that alone is one less headache to worry about.

That said, I don’t want to rely on my work laptop for managing this server (and others down the road) indefinitely, so it also means I need to put together some sort of Windows box to sit in the corner and collect dust until it’s needed once in a blue moon…

Still, my Plex environment … minus the media itself … now lives on a cushy, new RAID 5 array that could sustain a single disk failure without missing a beat, plus I’ve got some extra cushion for downloading new stuff to boot.

Not too shabby for only ten months’ worth of work!

Addicted to … Light Bulbs?

February 11, 2019 12:49pm

Over the weekend I splurged and bought a few of those fancy Philips Hue smart light bulbs with some of our tax refund.

And yes, before we get too far into this – I still think that they’re overly expensive…

…but they’re also undeniably really cool!

I ended up with two sets of lights – three of the color bulbs to update the ceiling fan in my office, and also a light strip to replace the cheap one I ran along the ledge in our living room that burned out about four weeks after I installed it.

All told, I spent about $250 … which I know is crazy for a handful of light bulbs!

(hint: Definitely shop around and pay attention to bundles for the best prices – I got a starter kit with 3 bulbs, the hub, and a switch for $120 that would normally cost $195 separately, whereas the two bulb and hub kit was $100 and the four bulb and hub kit was $200. Best Buy was also cheaper than Amazon for me.)

But so far I’m pretty impressed with their versatility and ease of use once I got the first one set up.

My biggest struggle was getting their hub installed, and through no real fault of Philips – turns out I didn’t have any empty ports left on my router, or outlets left on the nearby power strip, so I had to do some juggling there to hook up a new switch and swap out the power strip for one that better accommodates the bulky power supplies that way too many devices still use.

Other than that, the only real setup pain was having to push the button on top of the hub to authenticate it with their app on my phone, which was only a pain because all of our networking stuff is on a high shelf in our bedroom closet, so I had to walk all the way across the house a couple of times to do it.

That said, from a security standpoint I kind of like that the Hue Hub requires a physical connection to your network rather than wifi because it’s a lot more secure than relying on customers to change the default password once they set it up, which would likely never happen.

So on to the lights themselves!

I mean, it’s still about 90% novelty, but I thought it was really neat to be able to change the colors of the lights in my office pretty much in real time just by moving my finger around the color map on my phone! I also like the preset scenes that you can pick from, although I wish that there were more of them. There might be a way to download more – I only spent a little time playing around with things in the Lab, but it looks like there are a lot of experiments to choose from as well as 3rd party apps that do animations and stuff, too.

I even let Christopher play with them for a few minutes and he thought it was pretty cool, too!

For my office, I’ve basically got the three color bulbs in my ceiling fan, and then I also have both a dimmer switch/remote thingy as well as a motion sensor linked to them. I’m still unsure about the switches because frankly, I don’t know where to put them where the kids won’t constantly steal them! But I thought the motion sensor was really cool because it lets me not only have the lights come on when I walk through the door, but also have them come on to different scenes based on the time of day!

Right now I have it set to normal lighting during the day and then a much dimmer, tropical scene of blue and green after 11pm.
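For what it’s worth, the Hue hub also exposes a simple local REST API, so that same day/night switch could in theory be scripted directly instead of going through the app. A rough sketch of the idea – the bridge IP, API username, and light IDs here are all made up, and you’d register a real username by pressing the button on the hub first:

```python
import datetime
import requests  # pip install requests

# Sketch: flip three office bulbs between a bright daytime state and a dim
# blue/green "tropical" state after 11pm via the Hue bridge's local REST API.
# The bridge IP, username, and light IDs below are placeholders.
BRIDGE = "192.168.1.50"
USERNAME = "your-registered-api-username"
OFFICE_LIGHTS = [1, 2, 3]

def set_state(light_id, state):
    url = f"http://{BRIDGE}/api/{USERNAME}/lights/{light_id}/state"
    requests.put(url, json=state, timeout=5)

if datetime.datetime.now().time() >= datetime.time(23, 0):
    # Late night: dim blues and greens (hue runs 0-65535, sat/bri run 0-254).
    scene = [{"on": True, "bri": 60, "hue": 45000, "sat": 200},
             {"on": True, "bri": 60, "hue": 25000, "sat": 200},
             {"on": True, "bri": 60, "hue": 45000, "sat": 200}]
else:
    # Daytime: plain bright light.
    scene = [{"on": True, "bri": 254, "sat": 0}] * 3

for light_id, state in zip(OFFICE_LIGHTS, scene):
    set_state(light_id, state)
```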

I really want to experiment with trying out different types of lighting for when I’m writing late at night because I think it might be a cool way to help set the mood based on what it is that I’m trying to write!

That said, my other purchase was a light strip (and an extension) to go out in the living room – mostly to be used as ambient light on this ledge we have that runs the length of the room up by the ceiling. I think you’re supposed to put plants and stuff up there, but we’ve just got a couple of pots, a pillow, and a Mickey & Minnie statuette from when we got married. And also now some neato blue lights!

This is a lighting scheme that has evolved over a couple of years now. I first just had some white Christmas lights up there, which turned us on to the idea of having ambient lighting but unfortunately burned out after only a couple of weeks of 24×7 use. Next I picked up a much cheaper version of these lights from some random seller on Amazon, which were fine for the most part except that the remote was very finicky and almost never worked, and sometimes the colors flickered and had varying brightness.

For what it’s worth, I paid less than $20 for those and $120 for the new Philips lights.

I honestly think I like these even more than the ones in my office. I initially set them to a light blue just for some nice ambient light, but later, after Christopher had fallen asleep on the couch, I changed them to white and was surprised that they gave off enough light for me to work without turning on any of the others while he was sleeping. That was a nice bonus that I hadn’t expected.

Now – for the future…

Again, even though these things are still pretty expensive – about $40 for a single bulb whereas I typically buy CFLs or their LED replacements for maybe $6 apiece – I’ve already found myself scoping out the next places around the house where I want to install them!

I think they’d be really great in the hallway at night to add a nightlight effect like the one we sort of get from our Nest smoke detectors, which isn’t nearly as bright or versatile.

I want to update the ceiling fan in our bedroom just like my office to give us a change while sitting in bed getting ready to fall asleep.

They also just recently released some outdoor bulbs, including some that would be perfect for the front corners of the garage, which I think would be really cool, although I might have to upgrade our wifi first because I’m not sure if our existing AP’s range goes that far.

I mean, sure, to do the entire house all at once would just be ridiculous. I think I counted up the other day and found a total of about 50 bulbs around my house, so that’s $2,000 in light bulbs right there and that doesn’t even include switches and sensors and stuff! But if you were to do it maybe a room at a time, I think that’s a lot more manageable, plus it gives me some time to work out the kinks like figuring out the best way to handle switches so that they don’t end up walking away on us.

Eventually I’d love to get to the point where everything is switched over, including adding some accent lighting around the garden out front and the pool area in the back. In theory it’d be neat to then start to tie some of these things together – i.e. turn on the hot tub, set the lights, turn on some music, and pour me a drink…

Ok, maybe not that last one just yet, but the future is right around the corner! 😉

I used to be of the mindset that because domain names are relatively cheap, there’s really no reason why a person’s assorted web projects shouldn’t stay online indefinitely.

And yet right now, I’ve got a couple of different domain names coming up for renewal … and I’m not so sure that I want to bother renewing them anymore.

In total right now I have about a dozen domains registered, and I think if you twisted my arm, I could make reasonable arguments for keeping maybe seven of them. It used to drive me nuts when I’d see domains snatched up by spammers just looking to make a few bucks in bulk off the old site’s referral traffic for as long as it lasted, although surprisingly, if I look up the last handful of names that I’ve let lapse … they’re all actually available to register right now.

And not even for $1,200 from a reseller, either!

Maybe that bizarre bubble just finally burst, but either way it’s increasingly hard to justify keeping them when A) the sites get almost no traffic, and B) if I want to maintain them, I can always throw them under a subdomain on this site like I’ve done in the past. At this point they’re just as good to my portfolio as xyz.scottsevener.com as they are as xyz.com … plus, it saves me ten bucks for every one I kill off.

I’ve got four more days to make my decision… 😯

Musing About Disney World and Stats

January 16, 2019 3:30pm

There are some times when I would really love to get a glimpse of some of the statistics around Walt Disney World.

My curiosity this morning is specifically around Disney Vacation Club (DVC) and how many members there actually are in order to better gauge how many people are really fighting over the tickets to the Moonlight Magic events that are continuing again this year. Today was the opening of registration for the first two events at the Magic Kingdom, and even though I logged in about 15 minutes after the opening at 9:00 am, I was pleasantly surprised that it actually wasn’t difficult at all to get them this time…

I’ve been to a handful of these events since Disney started doing them back in 2016, but more often than not availability was very limited, and if you weren’t ready to jump on the email the second it showed up, you were plum out of luck.

Because if I remember right, the first few events didn’t have a scheduled reservation time – Disney just sent an email when registration was ready … and sometimes it would come at three in the afternoon when everyone was at work … and some people wouldn’t get theirs until a day or two later!

So I don’t know if maybe Disney just wised up and either beefed up the servers that host these sites or increased the capacity admitted to the events … or both. Even looking back further, it seemed like they were always having problems with demand for any Annual Passholder special events that they hosted as well … which always seemed odd to me, because you’d think a $100 BILLION company like Disney would invest more in its infrastructure to make the web experience absolutely flawless.

It’s really the same with the My Disney Experience app or guest wifi in the parks. I’m sure their hardware has to be beefy with possibly tens of thousands of concurrent wifi users online in a single park each day, but again … you’re Disney! You charge $100+ admission just to get in the door! Figure it out!

But just from a numbers geek’s perspective, it’d still be neat to know to what scale we’re dealing with for any of these problems – if the Magic Kingdom holds 100,000 people (hypothetically), do they let in 25% of that for the DVC events? Or any of the other hard ticketed events, for that matter???

It admittedly always amazes me that any of the four parks can average at least 30,000 guests a day (according to TEA/AECOM’s reporting) and yet it doesn’t feel like you’re surrounded by such a sea of people, because everyone is spread out across 100+ acres and eating/shopping/doing attractions to balance the crowds!

I think you could do some really fun stuff – and I’m sure that there are entire teams of geeks at Disney who do just that! – with access to the actual numbers of how many people walked through their turnstiles each day, or rode Space Mountain, or bought Mickey bars.

There’s a part of me that thinks it would be super cool if there’s a NOC-style room somewhere at Disney that has giant maps of each park up on the walls with heat maps showing guest flow around all of the lands and attractions.

They’ve got to be getting something more out of these MagicBands that we’re all wearing now!

Robots Playing Video Games???

January 13, 2019 12:50pm

This is a really cool video of Super Mario World, and not just because it’s an incredibly fast speed run featuring all sorts of exploits that I’d never seen before, but also … the game is being played entirely by a robot!!!

So apparently what they’ve done here is hack the SNES controller so that instead of sending inputs to the SNES by physically pressing buttons, a computer can send the commands … which is obviously much faster and enables some pretty neat exploits that are (usually) too quick for a human to pull off.

For example, and I thought this was super interesting, apparently the fastest way to move Mario in Super Mario World is to rapidly change the direction that he’s facing as you move (i.e. pressing left, then right, repeatedly). And we’re talking within a single frame of the game, so it’s not exactly something that a person could do, but when it’s just a computer sending signals to the console, it’s just another command.

Now as far as I can tell (and please correct me if someone reads this and sees that I’m wrong!), the computer isn’t necessarily reacting to changes in the game – it’s essentially running a script that the designers have tested out and tweaked to get the best possible outcome, whether that’s triggering a Level Complete action before the level is actually complete by playing to an exploit in the game or defeating Bowser at the end.

One of the reasons I’ve always enjoyed watching speed runs is because I think it’s incredible to see people play who have such a mastery of a given game that they not only know where every last item or power-up is, but also things like which order to kill monsters in, because they understand how the game tracks those kills and determines which power-ups to drop next. So to then be able to take it another step and know which locations in memory hold various statuses so that they can exploit weaknesses to make the game do things that it’s not supposed to – just wow!

Just one more example – here’s a glitch in Super Mario World that somehow triggers the end credits less than a minute into the game…

You can read the link above to explain the glitch better than I can, but it’s pretty cool stuff. 😉
