Nginx permalinks and search fixed!

So I realized the other day that I couldn’t have nice permalinks for some reason, so I set out to figure out why and fix it, since it had all worked fine before I migrated from Apache to Nginx a month or so ago.  Plenty of sites suggested a fix involving changing the location block around a bit, but I quickly realized that while those fixes would indeed let me switch to non-default permalink styles, they invariably broke the ability to search the website, which rather defeats the point of using it to store thoughts and fixes and recall them with a quick search.  Finally I stumbled across the silver bullet for this problem, and I will share it here so that hopefully nobody else spends an hour and a half of their Sunday messing with this when there are QSOs to be made!

Original location block
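Something along these lines — a bare-bones default that serves files but knows nothing about WordPress’s pretty URLs (a reconstruction of the typical setup, not necessarily verbatim):

```nginx
location / {
    # only serves real files and directories; pretty permalinks 404
    try_files $uri $uri/ =404;
}
```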

and now the updated one
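The trick is the $args tacked onto the try_files fallback — it forwards the original query string to index.php, which is what keeps WordPress search (?s=term) alive alongside the pretty permalinks (again, a reconstruction of the standard recipe rather than my exact block):

```nginx
location / {
    # fall back to index.php and keep the query string, so both
    # pretty permalinks and ?s=search requests keep working
    try_files $uri $uri/ /index.php?$args;
}
```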


And with a quick restart of nginx (which took a few tries because I’m too used to systemctl over service already), ta-da: I can search again and have the fancy-pants permalinks that let search engines better index my ramblings.  Oddly enough, as I was testing this I saw a spider doing its thing in the server’s access logs; pretty interesting that a spider would be active on a Sunday at around 11:45 EST.

Network Manager and OpenVPN

It blows my mind that Network Manager is still as bad as ever.  I had just finished pointing my new phone at the home VPN when I remembered that the laptop lost all the old settings in my switch to Fedora, so I figured I would give it a spin and see if somehow NM had been fixed.  A few minutes and some profanity later, it seems it STILL is unable to properly load .ovpn profiles and parse the various bits into the fields where they need to go.  Even when I manually split up the keys and certs and all that, it only worked halfway: I could connect to the VPN but was unable to browse the internet over it or even access resources local to the VPN server itself.  Fortunately the command line comes to the rescue again; all I had to do was tell openvpn itself where the config was and it did all the legwork that the abomination known as Network Manager failed to do.  For those who might care, the proper way to invoke it is as follows
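(the profile path here is just an example — point --config at wherever your .ovpn actually lives):

```
sudo openvpn --config /etc/openvpn/client/home.ovpn
```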

Now I just have to make a handy way to suppress the output, give myself a status indicator, and kill off the connection when I am done with it…

Successful Upgrade is Successful

I would say I can’t believe I’m typing this from a successful full upgrade from Fedora 23 to 24, but I’m not, since I am at work and they frown upon me pecking away on my personal laptop.  Still, I am amazed that it was an absolutely painless process to upgrade from 23 to 24 with dnf.  In prior years it was almost always advisable to reinstall rather than attempt an upgrade from one major release to the next, but the fine folks over at Fedora seem to have hit a home run on this one.  Sure, it took a while to apply everything, but the moment of truth (or reboot) came and passed and all I got was my normal login screen: no fancy explosions of failed video drivers, no corrupted profiles or missing files.  It went so smoothly I almost didn’t think it had worked until I checked the redhat-release file and verified that it was in fact on the 24 release.
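For reference, the dnf route Fedora documents for this boils down to four commands (run as root or with sudo; the last one reboots into the offline upgrade):

```
sudo dnf upgrade --refresh
sudo dnf install dnf-plugin-system-upgrade
sudo dnf system-upgrade download --releasever=24
sudo dnf system-upgrade reboot
```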

Crontab – Always Check your Environment Variables

So I have been running into this issue for about a month now where a script that I run from the command line by hand executes fine, but when I try to run it via a crontab job it goes absolutely pear-shaped without any real explanation.  Finally I got some time at the beginning of a shift to sit with one of our senior guys and take a look at it, since the script provides data the entire team uses and when it doesn’t run they get cranky.  It turns out that the environment my cron jobs run in is highly different, as indicated by the following, which was obtained by adding a line to output env to a text file every time the crontab job ran.
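The capture itself was just a crontab entry along these lines (schedule and path are examples):

```
* * * * * env > /tmp/cron-env.txt
```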

Compare that against the results from env when run by hand

Notice the PATH statement is very sparse when cron outputs the environment variables.  It turns out that anemic path lacked access to fping, which was integral to my script building out a list of live hosts within our lab environment.  Once that was fixed, the cron jobs hum along nicely and churn out an updated map of the lab every hour without me doing anything.  Now I know that crontab jobs run with fairly different environment variables than manually run scripts, which can cause all kinds of havoc if you don’t use full explicit paths in the bash scripts you plan to automate.
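The fix can go either way: call fping by its full path inside the script, or set an explicit PATH at the top of the crontab so cron jobs see the same directories a login shell would.  A sketch of the latter (the schedule and script path here are made up):

```
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 * * * * /home/user/bin/labmap.sh
```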

scripting: system-help

We have this handy script at work that pulls all kinds of useful details from a system and saves us a ton of time checking by hand, so I took a stab at making my own version for generic use.  It’s not very good at all, but it kind of works and could probably be expanded into something actually useful.
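The repo has the real thing; as a rough illustration, the general shape of that kind of script looks something like this (a hypothetical sketch, not the actual repo contents):

```shell
#!/bin/sh
# Hypothetical sketch of a generic "system details" script -- just the
# general shape of such a tool, not the version in the repo.
echo "Host:    $(hostname)"
echo "Kernel:  $(uname -r)"
echo "Uptime:  $(uptime)"
echo "Disk usage on /:"
df -h / | tail -n 1
echo "Top memory consumers:"
ps aux --sort=-%mem 2>/dev/null | head -n 4
```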

Repo on Github

Pure Win – Monitoring Comcast’s Failures

So this is the greatest thing I have read in ages; for those too lazy to click, a person having lots of problems with Comcast took it upon themselves to create a Python script that ran a speed test and, if certain conditions were met, sent a tweet to Comcast complaining about not getting what they paid for.  Unfortunately there are lots of people getting paid to fellate Comcast, and they flocked to the r/technology thread on Reddit to remind OP that he should be grateful for paying a pile of cash for “up to 150 Mbit”, like he is somehow blessed to have the absolute worst company in the US as his ISP.  Naturally I grabbed the code and set it up on my development box to run every 15 minutes, so hopefully in a week or so I can generate a fancy graph to see just how badly Comcast is boning me on my already high monthly bill.

Snowmageddon 2016

So most of the east coast is currently buried up to its nether-bits in snow, in case you haven’t been keeping up with things.  I figured this would be an ideal time to slap my newly purchased GoPro up in the window and let it take time-lapse pictures of the snowfall as it started to come down yesterday.  Unfortunately I found out after I had filled up the 32 GB memory card that it was only able to take pictures at half-second intervals, so I didn’t get anywhere near the amount of footage I wanted for a cool-looking timelapse of the snowfall.  I did manage to cover maybe the first hour or two of it, though, and processed it into a video for the enjoyment of the masses.

Strange copy behavior?

So a friend hit me up today to let me know he had updated a SQLite database that we use in a project, and that I could go ahead and copy it over to my home directory to update things with.  I logged into the box, sudoed to root, and copied the file with full paths, and something bizarre happened: the file, which he had ownership of, changed over to my user-level account.  Immediately he suggested that it might be the -a flag in an alias, but my alias was simply set to use -i, so I deleted the file from my home directory and tried the copy again.  As far as I can tell this shouldn’t be happening, because I didn’t specify the -a flag and the user copying the file was root, so if anything root should own the file once it lands in the directory.  I doubt this is any kind of nefarious or exploitable situation, but it does seem strange, because I remember forgetting to chown files after moving them as root in the past and things not working until I went back and corrected the ownership of the files.
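One plausible explanation worth noting: when the destination file already exists, cp truncates and rewrites it in place rather than creating a fresh file, so the destination keeps its existing owner no matter who runs the copy.  A quick demo of the rewrite-in-place behavior (paths are just examples):

```shell
# cp onto an existing file truncates and rewrites it in place, keeping
# the destination's inode -- and therefore its existing ownership.
echo old > /tmp/dst_demo
ino_before=$(ls -i /tmp/dst_demo | awk '{print $1}')
echo new > /tmp/src_demo
cp /tmp/src_demo /tmp/dst_demo
ino_after=$(ls -i /tmp/dst_demo | awk '{print $1}')
[ "$ino_before" = "$ino_after" ] && echo "same inode - ownership preserved"
```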

Troubleshooting Script

I have been planning to try to convince a friend to take up Linux in place of her aging Windows 8.1 system since it’s been officially put out to pasture by Microsoft, so I started thinking about supporting said system, since my friend is not exactly a Linux guru.  This of course led me to think about a very handy script I use at work that does a whole bunch of things like check database settings, look for necessary running processes, and look at system loads of our software to determine if there are any easy fixes before getting into logs and headaches.  Naturally I cannot share this script because it’s work related (and I didn’t write it either), but it made me decide to make a version of my own that I can show off.  Currently the script resembles a rather ugly gnome of some type, in that it is short and VERY ugly, but I figured I would toss it out so I can test a few other things like my nifty social share buttons, which are about as basic as you can get without setting foot inside a HS chemistry classroom.

Github/Gist

The long-term goal for this script is to make it collect a whole load of system data, run some basic checks like pinging Google, and probably a few other things like making sure vital items like the crontab haven’t somehow been deleted.  I will probably also make it verify that my ssh key is still active on the system so I can remote in if I have to, and I might consider some sort of reverse ssh invocation as well if I really want to get fancy with the script.  If it saves me even 10 minutes when fixing the system that I haven’t even set up yet, then my past hour or so messing with it, and remembering all the things I’ve forgotten about scripting in the past few months, was well worth it.  Plus, it gets me back to actually posting something here for a change.
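As a sketch of where it’s headed, the checks could all funnel through one little PASS/FAIL helper (the labels and paths below are placeholders, not the current script):

```shell
#!/bin/sh
# Rough sketch of the planned checks; labels and paths are placeholders.
check() {
    # run a labelled command quietly and report PASS or FAIL
    label=$1; shift
    if "$@" >/dev/null 2>&1; then
        echo "PASS: $label"
    else
        echo "FAIL: $label"
    fi
}

check "crontab still present"   sh -c 'crontab -l'
check "ssh key still installed" test -s "$HOME/.ssh/authorized_keys"
check "can reach google.com"    ping -c 1 -W 2 google.com
```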

Cliche End-of-year Introspection

I was just kind of aimlessly scrolling around making sure things worked right after having to restore from a mangled update, and I realized that this blog has been up for five years now. The first post I made was in September of 2010, and since then I have managed a whopping 102 posts over the course of those five years. Doing some napkin math, that works out to about one post every 17 days if things were spaced out evenly, though I suspect it’s anything but. Kind of surprising to think that 5 years have passed since I puzzled out how to slap WordPress up onto a Digital Ocean droplet and start spouting off my random stupid thoughts. Most days I don’t even think about the blog; hell, most days I try not to think at all if I can help it.

It is, however, amazing to think that I have this little slice of the internet to myself to use as a personal soapbox and sometimes portfolio of my various attempts at programming something interesting or useful. Back in the day it was an Angelfire website I posted Duke Nukem 3D maps to, which then evolved into teenage angst on Tripod as I learned a little more about things like HTML and CSS while perpetrating absolutely horrifying graphic-laden designs that have thankfully been forgotten for the most part. Along the way I have managed to learn a few things, like how to fill my basement with loud computers that sometimes do what I tell them and give me a place to try out things that might otherwise bring down actual production systems at work. I can safely say that I can write terrible code in PHP and Python both, and I bet I could make some awesomely inefficient shit in Go given enough Wild Turkey. I can sometimes understand the difference between DELETE and SELECT when writing MySQL queries, and I understand that Salt not only makes my heart die a little bit but also lets me ruin multiple VMs at once in my lab.

So I guess maybe I have learned a few things over the past few years, but most days it sure doesn’t feel like it; I’ve been told that is a sign of actual wisdom, but I’m not convinced just yet. It’s far more likely that I am just a very clever impostor who has somehow wormed his way up from renting movies in a dying video store to being allowed to assume the mantle of Technical Support Engineer and get paid far more than my pathetic knowledge and lack of skill are really worth. Most of the time I feel tired, confused, and quite often way more stressed than a single 32-year-old male with no kids or pets should be, and unfortunately I don’t think that is ever going to change no matter how much money I make or how many fancy titles I get.