Troubleshooting page load speed, part 45…

December 20, 2016 3:31pm

For the record, I’m painfully aware that the page load times on several of my sites have been, well, unbearable as of late!

Admittedly, it’s a problem that I’ve been stalling on for some time now, despite the regular high-load emails that I get from my server. A few days ago I got one reporting a 5-minute load average as bad as 68.94 … and this is on a VPS with three CPUs and 5 GB of RAM that honestly serves pretty light traffic right now, unfortunately…
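For a sense of scale, load average roughly counts runnable (or uninterruptibly waiting) tasks, so anything much above the CPU count means work is queuing. A quick back-of-the-envelope using the numbers from that alert:

```python
# Rough context for that alert: load average vs. available cores
load_avg_5min = 68.94
cpus = 3

# Anything much above 1.0 per core means processes are waiting in line
runnable_per_core = load_avg_5min / cpus
print(f"{runnable_per_core:.1f} runnable tasks per core")  # ~23.0
```

In other words, roughly 23 things fighting over each CPU — no wonder pages crawled.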

I had been hoping that most of it could be chalked up to an OS issue, because I recently discovered that cPanel had stopped updating on me after reaching a version threshold where the latest releases would no longer run on 32-bit operating systems. I was kind of surprised to learn that I even had one, but then again, this VPS was set up back in 2012, so I suppose 32-bit could’ve still been the default four years ago.

The trouble is, there’s really no clean way to upgrade a server from 32-bit to 64-bit while leaving everything intact, so it requires spinning up a new machine and migrating everything over to the newer OS.

Plus, the way I migrated to a VPS four years ago from my plain, old shared hosting account of maybe eight years was by using cPanel’s built-in account transfer features, which made it incredibly easy (plus my host did most of the work!) – but lord only knows how much random garbage has accumulated in all of those files over 8 + 4 years of shared history!

So I had planned on making the migration sort of a clean-up effort along the way and only copy over the guts of each WordPress install, leaving behind whatever caches and other nonsense have accumulated over the years.

And then terrible performance got even worse!!!

When it got to the point where it would literally take upwards of a minute to move from one page on my blog to another, and the admin pages would randomly get disconnected because they couldn’t touch base with the server when it was super overloaded, I knew that it was time to finally tackle this pig. So in a few hours’ time, I created a second VPS with my awesome web host and gradually let it go through all of the OS and app updates while I staged just one install – my multisite that contains my blog, Thing-a-Day, and about half a dozen other sites – and everything seemed to be going fine…

…until I switched my domain over to the new server…

…upon which usage started blowing up like crazy, again despite little traffic, and even though I started this new VPS a bit smaller than my main server (figuring I could upgrade once I’m ready to stop paying for the old one), it quickly became unusable just like the old machine had been.

From here I started doing some digging into WordPress itself, because I could no longer point fingers at the 32-bit OS. I downloaded a couple of plugins – P3 Profiler and Query Monitor – and with the latter’s help, I noticed that I apparently had a plugin that was just GOING NUTS against MySQL day and night:

To walk you through this fun graph: the orange is your normal SELECT queries that happen when anyone hits a page and WordPress queries its database to build it; the purple, on the other hand, is somehow all INSERT queries, which should really only ever happen when I’m publishing a new post, with a few exceptions.

And those two big valleys in the purple that you see around the 18th and then again between the 19th and 20th? The first is when I had temporarily pointed my domain over to the new server; the second is keeping the domain back on the old server but turning off the plugin in question … which apparently solved just about everything!

By the way, the last little purple sliver is me turning the plugin back on so that I can capture some logs to send over to the plugin’s developer to help him look for a fix…

because the thing is, I actually really like this plugin – it’s called WP Word Count and I’ve used it on just about all of my sites for years and years to summarize how much writing I’ve put into my projects. I love it, and if I can spare the time next year, I want to make use of its work to pull together a running total of word counts for all of my work combined so that I can ultimately put together a fun, little dashboard to help motivate me to keep putting words on the screen!

Luckily, after finding another multisite user with the same issue, I left a quick comment expressing the same and got a reply from the plugin’s developer later that evening. It’s awesome that they’re actively interested in patching this bug, because I’ve evaluated a lot of other options and honestly never found any that I liked better than theirs.

In the meantime it’ll have to stay off, though, as I continue with my fun server migration. During this whole effort, I’m also trying to really home in on The Perfect VPS WordPress Configuration, so I’m doing other things like tinkering with Varnish and considering adding Nginx in front of Apache, and eventually I also want to fine-tune W3 Total Cache so that I have one reliable set of caching / compression / minifying rules to use for all of the different sites that I publish … because I figure if I’ve seriously been doing this publishing-on-the-web thing for over fifteen years now, my pages should be some of the fastest loading around!
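For reference, putting Nginx in front of Apache mostly comes down to a reverse-proxy stanza along these lines – a minimal sketch only, assuming Apache has been moved to port 8080 and with the domain as a placeholder:

```nginx
# /etc/nginx/conf.d/proxy.conf — minimal sketch; Apache listens on 8080
server {
    listen 80;
    server_name example.com;

    location / {
        # Hand every request back to Apache on the loopback interface
        proxy_pass http://127.0.0.1:8080;
        # Preserve the original host and client IP for Apache's logs/vhosts
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

The real win comes from then letting Nginx serve static files directly and only proxying the PHP requests, but that’s per-site tuning.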

Stay tuned for more as I struggle through more issues to come! Now if I can only get this stupid thing to post… 😛

I like pizza, but I don’t need coupons for it every single day.

I noted the other day how it kind of bugs me to get promotional emails from the likes of Pizza Hut and Domino’s several times a week when in reality I don’t order out that often – and even when I do, it’s certainly not from all of those places at once!

Sure, I could go to RetailMeNot and look for deals there, which works for some eateries, but there are also rewards programs – Red Robin particularly comes to mind – that actually require you to be on their mailing list in order to participate and get free stuff.

So after mulling it over for a couple of days, I finally sat down and set up a couple of filters in Gmail to do exactly what I wanted: route my preselected ad emails to a separate folder (or label, in Gmail), and then automatically delete anything once it gets older than 45 days. The idea is that now my promo codes will be there when I want them, but I don’t have to waste any time sorting through them when I don’t.

Here’s what they look like:

Filter #1 routes any emails from the specified addresses to my new Ads label so I’ll never see them in my regular inbox…

(note: use curly brackets {} around multiple addresses to specify OR instead of AND)

[screenshot: gmail_ads1]

Filter #2 then does the dirty work and cleans up afterwards, monitoring for old emails and deleting them once they’re 45 days old, because more than likely any promo codes within will have expired by then anyways…

[screenshot: gmail_ads2]
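In Gmail’s search-operator syntax, the two filters amount to something like the following – the addresses here are placeholders, not the exact ones I used:

```
# Filter 1 — matching mail: skip the inbox, apply label "Ads"
from:{offers@pizzahut.example deals@dominos.example news@redrobin.example}

# Filter 2 — matching mail: delete it
label:ads older_than:45d
```

The `{}` grouping is what makes the from: list an OR instead of an AND, and `older_than:45d` is Gmail’s relative-date operator.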

PROBLEM SOLVED – that’s one less thing cluttering my inbox without actually unsubscribing from any lists!  8)

A Verizon FiOS Upgrade Update … of Sorts

November 18, 2015 2:36pm

So here we are, 10 days after my rant about issues with getting my FiOS Internet speed upgrade.

The good news is, I’m officially running at the 150 Mbps that I originally wanted … sometimes.

The bad news is, I’m not super crazy about what I had to go through to get it.

Here’s a quick timeline…

  • 10/23 – Found upgrade options missing online; no luck with support via phone or Twitter.
  • 10/29 – Sent an e-mail pressing further; the response said it was a management decision.
  • 11/12 – Sent an e-mail to Verizon’s head of FiOS; got a response from his office in a matter of hours.
  • 11/13 – Spoke on the phone with executive customer support, who overrode the issue and scheduled my upgrade.
  • 11/16 – Tech came out and did the install.

It turns out that the final version of the story I was given is that Verizon ran out of equipment nationwide, so in the meantime they decided to limit this particular upgrade to new customers until they could get the hardware shortage under control. I didn’t ask whether it was related to next year’s sale to Frontier, but it frankly wouldn’t surprise me; I wouldn’t blow a lot of money on equipment if I was selling the business soon, either…

Anyways, the gentleman from the VP’s office was very polite and offered to honor my upgrade by first submitting an order for 300 Mbps to get it through the system, then coming in afterwards to back it down to the 150 Mbps that I actually wanted. And I did get my router included, though there was a one-time $150 install fee, which I was honestly fine with at this point.

The install itself went super smoothly – better than most, in fact – because instead of running a new ethernet line through the attic from the ONT to my router, he was able to make use of an existing line that ran to a smart panel in my closet, where the router now resides anyways, so all in all we were probably done in about two hours. Speed tests were a little bit of a pain just because only my server is currently hardwired and it doesn’t have Flash installed, which 99% of the speed tests require, but we worked it out nonetheless…

[screenshots: speedtest300, speedtest150]

It was definitely hard to say goodbye to that 300 Mbps, though in no way can I justify another $90 on top of what I’m paying already; plus, in reality, I found that the places I normally download from couldn’t push more than about 200 Mbps at me at a given time anyways … which was kind of expected. So it’s certainly worth noting that as sexy as the prospect of gigabit Internet is, it’s really only useful when multiple devices are pulling at the same time, at least for the time being.

Still, this effectively doubled the speed at which I can download movies and TV shows, so that’s cool!

Also, my ping time is roughly 1/4 of what I was used to seeing, so also cool.

That said, even in just a day I’ve noticed my speed wobbling a bit – sometimes I can get the full 150 Mbps, sometimes it clocks in at less than the 75 Mbps that I had before … not sure if that’s just standard Internet congestion (though I didn’t see it much before) or if moving me over to GPON puts me on a busier node where I’m competing for bandwidth more than I did on BPON. Will have to keep an eye on that…

Anywho, at the end of the day my only real complaint is simply that I had to jump through so many hoops to get where I am today. One typically shouldn’t have to complain to a VP in order to get their service upgraded, and better communication at any step in the chain would’ve at least calmed me down and made me a little more understanding. I guess the moral is if you’re not getting anywhere with customer service, just go straight to the top and try there instead … which is terrible advice, really, but it seemed to work here.

I made the reference to said VP above a clickable link, just in case anyone else has the same problem… 😉

Digging deeper into server issues…

October 29, 2015 11:49pm

I think I’m making progress, albeit in a number of avenues that I wasn’t necessarily expecting!

One thing that’s standing out as more of a problem than I would’ve thought is, simply put, that I’ve got a lot of WordPress installs on this server!!! Sorting through them all, I came up with something like 20 installs in total, which is amusing because I’ve only got like 10 domains currently registered … apparently I’ve built up a healthy list of test installs in addition to some really old ones that I never post to and thus don’t really think about.

Now I wouldn’t have thought this to be much of an issue until I was able to start digging into specific processes running slow along with their associated URLs and database queries, and it turns out that WordPress’s own cron function has been at least part of the source of my problems, for a couple of different reasons:

A) Across those 20 installs, a number of them weren’t up to date – some so old that I had to update them manually (gasp!) – and more prominently, some outdated plugins either also needed to be brought current or, in some cases, removed altogether where I’m not even using them anymore (e.g. I used to heavily use a backup-to-Dropbox plugin, but I’ve since come to rely more on system-wide backups and I don’t even have Dropbox installed on my laptop today).

B) Also, I still need to learn more about how WP-Cron actually functions, but I think there may have been cases where many sites were all trying to run at the same time, which is just a general recipe for disaster! From what I’ve read so far, it sounds like WP-Cron actually triggers based on page views … which makes me wonder how my test sites were even triggering … but one option here might be to disable WP-Cron and instead tie each site into a real cron job at the OS level, so that I can scatter them throughout each hour instead of having them trigger arbitrarily.
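For what it’s worth, the standard recipe for that swap looks something like this – the site URLs and minute offsets below are placeholders, but `DISABLE_WP_CRON` is the documented WordPress constant for this:

```
# In each site's wp-config.php, stop WP-Cron from firing on page views:
define( 'DISABLE_WP_CRON', true );

# Then schedule each install at the OS level, staggered across the hour
# (crontab -e; site URLs are placeholders):
5  * * * *  wget -q -O /dev/null "https://site-one.example/wp-cron.php?doing_wp_cron"
25 * * * *  wget -q -O /dev/null "https://site-two.example/wp-cron.php?doing_wp_cron"
```

That way each site’s scheduled tasks run on a predictable clock instead of whenever a visitor (or bot) happens to load a page.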

I’m not entirely sold on it just yet, but realizing that I have so many separate installs definitely reinforces my curiosity about WordPress multisite, which I was playing around with earlier this summer and actually abandoned when I decided not to redesign some of my sites. But from a resource-management perspective it still might make sense, even if I just pull some of the like-minded sites into one install – or possibly a bunch of the test sites – just to help get the numbers down!

All in all it’s a work in progress, but my last high-load notification was at 5:50 pm last night and I’ve updated a bunch of my installs in the hours since, so hopefully I’m starting to work through the issues and load is at least going down a bit! Mind you, it doesn’t help that I don’t really know what daily page view volume my current server should reasonably be able to handle … and granted, now that I’ve started routing some of my traffic through Cloudflare, I’m sadly reminded of how much junk traffic a server gets that never even becomes legitimate pages served (brute-force attacks, scanning, etc.)…

One other tool that I’ve found helpful specifically in pinpointing the cron issues has been New Relic, which is actually a probe that I had to install under PHP on the server itself, but which in turn monitors all sorts of neat stats about processing times and whatnot. I’m just using the free Lite version they offer for now and it’s already been enlightening – definitely worth checking out!

[screenshot: Screen Shot 2015-10-29 at 10.30.19 PM]

A Look at Web Page Performance…

October 28, 2015 5:49pm

I’ve been experimenting with performance tuning on my web server the last couple of days to try and work out some bizarre high-usage issues when, in reality, none of my sites (unfortunately) garner enough traffic to warrant the spikes that I’ve been seeing.

Some of it is common sense stuff like troubleshooting slow-performing WordPress plugins – for example, apparently W3 Total Cache was dragging down my response times even though it wasn’t active at the time, which led me to reinstall it and then actually set it up correctly, because I think I disabled it a few months ago out of sheer frustration.

I also made some tweaks to my Apache/PHP build, which resulted in my having to rebuild no fewer than a dozen times last night – each time I’d find a new option that I could only enable by rebuilding! So if for some reason you found one of my sites down last night, that would be why… 😛

In the midst of all of this, I’ve also been trying a number of different web page speed tests to try and gauge my progress through the whole mess – Google PageSpeed Insights is usually my go-to for general tuning, but I also like WebPageTest.org because it gives me waterfall graphs of every single element that needs to be loaded, and I also recently discovered GTmetrix, which is cool because it will run several tests at once and gives you the option to see results for multiple sites side by side … something I normally have to do in separate windows.

Anywho, one of the views that I found interesting from WebPageTest.org is where they break down the actual requests and bytes coming from each individual domain, because obviously the lower the number for either of those stats, the faster your page will load. Below is what Just Laugh’s homepage looks like…

[screenshot: Screen Shot 2015-10-28 at 4.50.58 PM]

What’s interesting here is that really only a select few of these domains are related to actual content – primarily justlaugh.com and then wp.com because our images use WordPress.com’s CDN via the Jetpack plugin.

All of the Facebook hits are for the single Like box in the footer, and the same with Twitter.

We also have a single ad widget for Amazon, along with a couple of Google Adsense units, and then we use both Google Analytics and WordPress Jetpack for analytics.

So really, the totals break down something like this…

  • Content – 75 requests for 509k
  • Social Media – 51 requests for 667k
  • Advertising – 51 requests for 634k
  • Analytics – 10 requests for 18k
  • GRAND TOTAL – 187 requests for 1,828k
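For what it’s worth, the grand total above does check out against the per-category numbers; a quick sanity check:

```python
# Per-category (requests, kilobytes) from the WebPageTest.org breakdown above
breakdown = {
    "Content":      (75, 509),
    "Social Media": (51, 667),
    "Advertising":  (51, 634),
    "Analytics":    (10, 18),
}

total_requests = sum(reqs for reqs, _ in breakdown.values())
total_kb = sum(kb for _, kb in breakdown.values())
print(total_requests, total_kb)  # 187 1828
```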

Now don’t get me wrong – there’s certainly value that comes from each of those other three sources, otherwise I wouldn’t use them in the first place, but it still says something interesting about web content these days when social media and advertising tie-ins together make up more than double the actual content that a website has to offer to drive those other things in the first place! And before you say that it’s really kind of my fault that the breakdown looks like this because I designed the site, I would add that Just Laugh is extraordinarily conservative with advertising compared to other media sites that still use pop-ups and wraparounds and those god-awful Taboola ads that currently infect 80% of the web today.

Of course, the real exercise here is simply how to improve on these numbers, which is tough because most of these requests are still measured in milliseconds and many are done in parallel. The whole page currently takes right around 10 seconds to render, which in some ways seems terrible but in comparison with sites like CNN and The Onion it’s actually about right in the middle.

Could I shave off a couple of seconds by eliminating the Facebook and Twitter widgets, or possibly even the Amazon widget???

Possibly, but would the savings really be worthwhile in the bigger picture when gaining Facebook and Twitter followers is also a worthwhile goal???

Clearly I have no idea, but it’s always fun to have real, actual data to wade through to consider things like that!

On the other hand, at least I can say for a fact that my caching is now working correctly because for repeat views, all of those 3rd party requests are pretty much the only things still getting reloaded… 😀

[screenshot: Screen Shot 2015-10-28 at 5.48.08 PM]

Speed Testing via the Linux Command Line

October 7, 2015 4:49pm

Last night I relocated a bunch of computer stuff – namely my home server and router – to our bedroom closet. On the positive side, it’s more up and out of the way, so we only have to listen to fans spinning when we’re picking out clothes to wear; on the not-so-positive side, it means that at least until I climb into the ceiling to run ethernet cables around the house, the rig in my office will be relying on wifi instead of a wired network connection for a while.

Now this didn’t really seem like much of a big deal until this morning, when I noticed that Verizon dropped its prices on the higher Internet tiers and upgrading to 150 Mbps is now only an extra $20 instead of $50!

And mind you, I don’t necessarily need most of that speed here at my desktop, but I am somewhat addicted to speed tests just to randomly remind me how awesome my Internet connection is these days, and not for nothing but speed tests over wifi kind of suck.

That said, my home server is still hardwired because it’s literally sitting right next to the router in the closet now, so a bit of quick Googling found this nifty post that provides a great walkthrough of how to run speed tests directly from the command line on your friendly, neighborhood Linux box…

I already had Git installed, so it took maybe 30 seconds to pull down the speedtest-cli script and copy it into /usr/local/bin, and then I was off to the races! I’m pretty much a sworn user of Speedtest.net, so seeing that it interfaces directly with them was an easy win. The customization is neat, too – you can either run against the fastest host by default or choose your own, in addition to getting the link for your results badge to wear so proudly.

My favorite feature, though, is how simple they made batch testing – you can actually pick multiple locations around the world and kick them all off in rapid succession. Normally I default to my web host up in New Jersey, because testing against a local server seems pointless when we don’t really have a lot of data centers for major websites around here anyways, but they were admittedly running a little slow this morning, so it was neat to be able to also throw in LA and Miami as two other corners of the country to help round out my test results!
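The batch testing boils down to something like this, assuming speedtest-cli is on your PATH – the server IDs in the loop are placeholders; `speedtest-cli --list` prints the real ones:

```shell
# List nearby Speedtest.net servers along with their numeric IDs
speedtest-cli --list | head -20

# Then hit several of them in succession (IDs below are placeholders)
for server in 1111 2222 3333; do
    speedtest-cli --server "$server" --simple
done
```

The `--simple` flag trims the output down to just ping, download, and upload, which makes comparing the runs easy.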

[screenshot: Screen Shot 2015-10-07 at 4.39.30 PM]

Now to see if I can find that promo where they were giving away the free router to upgrade to 150/150… 😛

Plex Streaming 101

October 6, 2015 4:56pm

For Plex proponents like myself, this video is a nice, simple walkthrough of the ways that Plex Media Server streams media to different devices both in your home and abroad.

I’ve been lucky up to this point in that the main devices we use Plex on at home are two Samsung TVs and Plex Home Theater on my computer, all of which support Direct Play. So far this has been a good configuration for us, because even adding my sister-in-law remotely – who occasionally needs transcoding due to the Internet speed on her receiving end – hasn’t really affected simultaneous playback here at home. Nonetheless, I can see beefier processing power in our future eventually, to help prevent buffering if we add any more relatives connecting the same way who all want to watch at the same time.

But hopefully by then I’ll have justified the bump up to a swanky rack-mount server boasting a sexy RAID configuration and a new motherboard that supports multiple processors! 😉

So I stumbled across this website called the Library of Babel last night, and it’s kind of freaky.

Essentially they’ve created an algorithm that generates every combination of letters … ever. Or at least up to 3,200 characters, for starters. But it’s all indexed, so whatever you type, there’s a page in this vast library that already says whatever you were going to say…

Like – this last paragraph that I just wrote – it can be found here:

[screenshot: Screen Shot 2015-10-04 at 3.30.22 PM]

Or even just completely made up nonsense that’s disappointingly not actually true:

[screenshot: Screen Shot 2015-10-04 at 3.33.57 PM]

Apparently the site is based on a 1941 short story by the Argentine author Jorge Luis Borges, written well before the Internet was ever a public notion, which is kind of crazy – conceiving of the idea before the architecture to actually make it a reality even existed … a futurist in the true sense of the word!

Now granted, despite having a computer that can literally generate any text that could ever be conceived, it still takes the creativity of humans to bring the next Harry Potter or Lord of the Rings to mankind … the catch of having everything is that the literary classics are surrounded by an almost infinite amount of garbage unless you already know what you’re searching for.

Even looking at only samples of 3,200 characters, the library currently contains 10^4677 books’ worth of information, whereas only approximately 130 million books are estimated to have been published in all of modern history … which is to say that the meaningful texts available represent only a fraction of a fraction of the everything that this algorithm creates…

…but it’s still kind of a neat concept from a technical perspective, nonetheless.

[photo: 20150831_025533935_iOS]

Just did an experiment: Plex videos have been buffering more than usual – especially some new home videos that I added tonight from our phones – so I ran a cable from the TV in our living room directly to our router to see if it really is the wifi bogging things down.

Verdict … yes and no.

Random stuttering seems to have improved and I was able to play a couple of key movies that have given me troubles in the past without a sputter in sight. That said, apparently there’s a known issue with Plex sending .MOV files to Samsung TVs on the home movie side, so I’ll have to do some more digging there…

I also learned – because I had to move my router into the spare room next door where the server is for the cables to all reach – that apparently the coax jack in that room isn’t live after all! Wonderful. Thankfully I had a piece of coax long enough to run around the corner from my office, but that’s one more thing I’m going to have to look into when I’m poking around the attic, which is now going to have to happen sooner rather than later on account of the ethernet cable now running overhead in the kitchen!

I think I need to get one of those dorky headlamps to wear before I plan my adventure upstairs… 😕

Home Tech Talk, Part 2

August 30, 2015 9:07pm

So picking up where I left off last night, I think these are going to be my next points of focus around the house…

Run Cat6 to the Living Room and Bedrooms
Not really looking forward to this because it means I have to go up into the attic, and I’m always afraid of falling through the ceiling, but I’m figuring that wiring up these two rooms so that Plex doesn’t have to run over the wifi is probably the easiest thing I can do right now about the occasional freezing that we see during playback.

Run Cat6 Out to the Garage for FiOS
Right now we have 75 Mbps symmetrical service, which is awesome, but to upgrade to anything faster we actually have to move away from using coax and run ethernet direct from where the fiber comes into the house to our router. Verizon claims that they’ll do this themselves during the upgrade, but who knows what the quality will be, and if I have to be up in the attic for wiring the other rooms anyways, it makes sense to just do this one at the same time.

Improving Backups on My Local Server
Part of this will be easy and part will be a pain. The easy part: right now I have a local install of CrashPlan pushing something from the server into the cloud, but I honestly don’t even remember what … it could just be test files from when I first got it set up, for all I know! So I just need to review and update the backup plan so that it includes more of the things that it should – Plex’s database and local settings, and some other random stuff sitting there that I want to preserve. It’s probably still not practical to push 20 TB of media files into the cloud, but I’m cool with sending 100 GB of music files for now.

The less easy part is facilitating backups from the web server that hosts all of my sites in an actual data center. They technically do regular backups within their own network at no charge to me, but just to be safe I wouldn’t mind pulling down another copy of everything to keep archived here. It’s only about 15 GB for everything anyways, so it’s not a ton of files, and I think that WHM even has an option to FTP a copy of the backup set to another location when it runs. I just need to take an afternoon to figure it all out and get the thing working…
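And if WHM’s own remote-backup option doesn’t pan out, a cron’d pull from the home server would work too; a rough sketch, assuming SSH key access is already set up – the user, host, and paths here are placeholders:

```shell
# Nightly, from the home server's crontab: mirror the web server's
# backup directory down to local storage (user/host/paths are placeholders)
rsync -az --delete backup-user@webhost.example:/backup/ /srv/backups/webhost/
```

Pulling from the home side also means the web server never needs credentials for my local network, which seems like the safer direction.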

Organizing and Sharing Photos
Now with my wife’s iPhone/iPad photos in the mix, we’re sitting on a collection of something like 40,000 photos from the last 15 years, and as unmotivated as I am to print them out and fill photo books, it’s still fun to flip through them online, so I think I want to finally figure out a manageable way to put them online for regular browsing. Not sure if it’s going to be Flickr, or a self-hosted WordPress install, or something else … I guess Plex has options for Photos & Home Videos, too, so this one’s still wide open, but it’s something I’d like to do before we get up to 50,000!!!

Automated Christmas!
And the last one is something that I alluded to yesterday – tonight I found another cool option, but their beginner set still costs upwards of $500, so it’s going to be a while before I can splurge here and it still might not happen this year. Still, I love the idea of a truly customizable Christmas light display where not only can I make the lights dance to music, but I can even choose the color of each bulb, so no more fighting for the right combinations of colors at Lowe’s and Walmart and wherever else my lighting takes me…

…although at &^(%% per string, they’d better turn any color you want them to! 😯

© 1999 - 2018 Comedic-Genius Media, All Rights Reserved.