This is Why We Can’t Have Nice (Free) Things

There was a little internet kerfuffle last week when Matt Mullenweg from WordPress correctly pointed out that Wix was violating the GPL. Now, he did it in maybe not the nicest way (“If I were being honest, I’d say that Wix copied WordPress without attribution, credit, or following the license”), but at its core, his argument was true.

A core part of Wix’s mobile editor is forked from WordPress’ GPL licensed editor library.

And that’s pretty much all there is to it. In the end, if you use something that is GPL’d in your application, you walk a fine line: you may need to open source your application and offer its code under the GPL as well. The GPL is a viral license (GPLv3 particularly so), and including code licensed under it is, at best, something you should only do after a close reading of the license. At worst, you simply shouldn’t include any GPL code at all.

Wix’s CEO posted a response and completely missed the point. As did one of their engineers. They both seem to think that intent matters. While it does matter in that it helps us understand that there was probably not any malicious intent, the GPL is the GPL and it’s pretty clear.

As Daniel Jalkut says:

if you want to link your software with GPL code, you must also make your software’s source code available … You have to give your code away. That’s the price of GPL.

Many developers understand, and view the price of GPL as perfectly justified, while others (myself included) find it unacceptable. So what am I supposed to do? Not use any GPL source code at all in any of my proprietary products? Exactly. Because the price of GPL is too much for me, and I don’t steal source code.

In my office, we’ve basically made the same rule. Even though we don’t ship code, we still stay away from GPL’d code as much as possible, simply to avoid any appearance of impropriety.

I look at the GPL like Dave Matthews Band. It sucks, there are lots of other licenses just like it that are much, much better, and its fans are so annoying as to make it that much worse.

The Olympics and Blaming Millennials

(Note: I’m not sure why these two articles bugged me so much. But they did.)

There was a somewhat poorly written (or, at least, poorly titled) article on Bloomberg (shocker) about the ratings drop for the Olympics on NBC. In typical Bloomberg fashion, it’s a clickbait title (who can resist blaming millennials?), even though the article itself points out that it’s the 18–49 demographic that saw ratings down (with no breakdown inside that demo to determine where the real drop was).

And in the 18-to-49-year-old age group coveted by advertisers, it’s been even worse. That audience has been 25 percent smaller, according to Bloomberg Intelligence.

In response, a millennial (presumably) attempted to defend his peers and lash out at NBC (though, really, it was more about the cable industry) and the inability of the cable/broadcast industry to meet the needs of cord cutters.

The issue I have with the article isn’t so much the argument. I agree that the cable and TV industries are going to have to change the way they think about the broadcast model. And, while it may not be changing as fast as we’d want, it’s changing incredibly fast!

Think about it: ten years ago, being a cord cutter meant using an antenna, borrowing DVDs from the library, and maybe downloading a show from iTunes.

Today, you could conceivably use Hulu, Netflix, HBO, Sling TV, and iTunes, and probably cover everything except live sports. And ESPN may be going over the top in the next year. That’s progress.

The article, however, takes almost 1,800 words to complain about how difficult it was to watch the Olympics online without a cable subscription, and then complains about too many advertisements and the lowest-common-denominator announcing during the opening ceremonies, as if these are new things. And, while cord cutters are a growing audience, something like 80% of households still have cable. Cord cutters alone didn’t cause the audience to drop.

No, it’s not until the last segment that the article hits on what I believe to be part of the real reason the ratings were down:

It opened with Simone Biles and Co., but then, despite being filmed earlier in the day, inexplicably goes from the earlier rounds of Gymnastics to Swimming. Hours pass before we finally get to see the resolution to those Gymnastics rounds

The ratings were down because NBC couldn’t figure out how to show events in real time to both the East and West Coasts. With clips showing up online, on ESPN, and on Twitter, the audience, millennials or not, couldn’t be bothered to stay up until 11:30pm ET to watch gymnastics finals that had already happened that day. Or worse, for the half of the country on the West Coast, that had happened many hours before.

NBC’s real crime is not figuring out how to get more of the core live events in front of the audience while they were live. Live sports are the only thing left that can really keep audiences from cutting the cord, and NBC (while well intentioned with their wall-to-wall online coverage) forgot that.

In the end, Bloomberg incorrectly blamed millennials, and, in turn, millennials (or at least this millennial) responded in the stereotypically myopic millennial way.

ScanSnap Directly to the Cloud

Last week, Fujitsu added an awesome feature to their ScanSnap scanner line (at least, the iX500 that I have). You can set it up so that, rather than needing a machine on the same wireless network to pick up the scanned documents, they just get shipped to your Dropbox or Google Cloud.

That lets you do some really interesting things. You can run Hazel rules on your Dropbox folder, just like you can on a local folder, to do automatic sorting, naming, etc. on your machine. You can also do some interesting automation with IFTTT to trigger other types of activity based on files getting scanned. Or some combination of both (you scan some sort of receipt, it’s automatically filed into a folder via Hazel, which also triggers an IFTTT action to email someone to tell them the receipt is there).
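To give a sense of the kind of thing a Hazel rule does there, here’s a rough Python stand-in: it looks at the Dropbox folder the ScanSnap drops files into and files anything that looks like a receipt. The folder paths and the filename pattern are made up for the example.

```python
#!/usr/bin/env python
"""A rough, Hazel-like sorting pass over a Dropbox scan folder.
The paths and the "receipt" filename test are placeholders."""
import os
import re
import shutil
import time

SCAN_DIR = os.path.expanduser("~/Dropbox/ScanSnap")      # hypothetical scan inbox
RECEIPTS_DIR = os.path.expanduser("~/Dropbox/Receipts")  # hypothetical destination
RECEIPT_PATTERN = re.compile(r"receipt", re.IGNORECASE)  # naive filename match

def sort_scans():
    if not os.path.isdir(RECEIPTS_DIR):
        os.makedirs(RECEIPTS_DIR)
    for name in os.listdir(SCAN_DIR):
        src = os.path.join(SCAN_DIR, name)
        if not (os.path.isfile(src) and name.lower().endswith(".pdf")):
            continue
        if RECEIPT_PATTERN.search(name):
            # Date-stamp the file, then move it -- roughly what the Hazel rule does
            stamped = time.strftime("%Y-%m-%d ") + name
            shutil.move(src, os.path.join(RECEIPTS_DIR, stamped))

if __name__ == "__main__":
    sort_scans()
```

Hazel (or launchd, or cron) handles the “run this whenever something lands” part; the point is just that once the scans hit Dropbox, anything that can see that folder can act on them.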

The cloud feature seems small, but it’s a huge improvement to the convenience of what is already a device that has made my life a lot simpler.

Let’s Encrypt SSLs

A couple of months back, I went through the process of trying out Let’s Encrypt to set up some SSL certs for my various little sites. Do my sites really need encryption? No. But, at this point, it’s easy enough to set up an SSL cert, and I’d rather my sites pass their data securely, even if no one cares what goes on between my site and your browser. I’m not storing credit cards or capturing info about my visitors (beyond the analytics Google captures), but in a world where the government is increasingly looking for ways to get at the data of citizens, why not do it?

Plus, it’s free.

It’s a little bit of a challenge to get set up if you’re not already used to mucking around with server management. The newer versions (as of this moment, 0.5.0) make things much easier, but you’re still going to need to be at least familiar with git, python, and sudo.

Once you’ve got your certs and your servers configured, you just need to remember that these certs expire every 3 months, unlike the year (or longer) you get with more traditional certs. Currently, you’re on your own to renew them, but it sounds like they’ll be building out renewal scripts to make it easy.
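Until those official renewal scripts show up, something you can throw in cron works. Here’s a minimal sketch in Python, assuming the client is checked out at ~/letsencrypt, the cert was originally issued with the webroot plugin, and nginx is the web server; the domains, paths, and even the flag names (they’ve been shifting between releases) are assumptions to adapt.

```python
#!/usr/bin/env python
"""Minimal cert-renewal sketch, meant to run from cron every couple of months.
Assumes the letsencrypt client lives in ~/letsencrypt, the cert was issued
with the webroot plugin, and nginx is the web server. Domains, webroot, and
flags are placeholders to adjust for your setup."""
import os
import subprocess

LETSENCRYPT = os.path.expanduser("~/letsencrypt/letsencrypt-auto")
WEBROOT = "/var/www/example.com"                 # placeholder webroot
DOMAINS = ["example.com", "www.example.com"]     # placeholder domains

cmd = [LETSENCRYPT, "certonly", "--webroot", "-w", WEBROOT, "--renew-by-default"]
for domain in DOMAINS:
    cmd += ["-d", domain]

subprocess.check_call(cmd)
# Reload the web server so it starts serving the fresh cert
subprocess.check_call(["sudo", "service", "nginx", "reload"])
```

Dropped into cron early in the morning on the first of every other month, that keeps you comfortably ahead of the three-month expiry.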

SSL certs are already reasonably inexpensive (providers like Comodo often sell them for less than the cost of your annual domain renewal), but the ability to get certs for any number of subdomains for free is pretty compelling. Once the automation is in place, there’ll be almost no reason to run a server without https.

(Of course, Let’s Encrypt could be a big government ploy to get everyone to install free certs that they have the key to, and they’ll be able to eavesdrop on all of us with ease.)

Microblogging

Manton Reece has been working on an app/business/service that I think is really in the “own your own Twitter” space. Basically, why not own your own work, rather than just pushing it into Twitter.

It’s something I’ve thought about in the past. If I could post to Twitter and push those posts to my blog at the same time, it’d give me a full accounting of most of what I do online (suck in Instagram, and you probably get the totality of it).
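As a sketch of the plumbing (not Manton’s actual approach), a micropost really just needs to land in two places. Something like this, assuming the tweepy library, placeholder API keys, and a static blog that builds from a folder of text files:

```python
#!/usr/bin/env python
"""Post a short note to Twitter and to my own blog at the same time.
Only a sketch: the keys are placeholders, and the "blog" here is just a
Jekyll-style folder of dated text files."""
import datetime
import os

import tweepy  # pip install tweepy

CONSUMER_KEY = "..."      # placeholder credentials
CONSUMER_SECRET = "..."
ACCESS_TOKEN = "..."
ACCESS_SECRET = "..."
POSTS_DIR = os.path.expanduser("~/blog/_posts")  # hypothetical blog source folder

def micropost(text):
    # 1. Push it to Twitter
    auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
    auth.set_access_token(ACCESS_TOKEN, ACCESS_SECRET)
    tweepy.API(auth).update_status(status=text)

    # 2. Write the same text to the blog, so I own a copy
    now = datetime.datetime.now()
    filename = now.strftime("%Y-%m-%d-%H%M%S") + ".md"
    with open(os.path.join(POSTS_DIR, filename), "w") as f:
        f.write(text + "\n")

if __name__ == "__main__":
    micropost("Testing the cross-posting sketch.")
```

The hard part isn’t the posting; it’s everything this sketch ignores, like threading, replies, and the context problem below.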

I’m interested to see what he comes up with. I think, often, that my Tweets only make sense in the context of the moment. A Celtics game or a concert, or whatever is happening on TV. Some are of the moment in a world sense, and make more sense standing alone.

For example,

Serial is pretty popular, so that stands up on its own alright (and, for fun, go search Bergdahl and Rand on Twitter. It’s amazing.)

This tweet, however,

only makes sense when you realize I was at the Celtics/Clippers game before the All Star break, the one the Cs pulled out in overtime.

If you push your tweets/microposts to your blog, even if it’s within the context of your other tweets/posts, can you maintain that context of the moment? I’m not sure.

It’d be amazing if, whether via an app or later inside of your blogging application, you could add that context. If I could post from an app that knows my location, can determine I’m at the Celtics game, and can add enough metadata to that tweet to put it in the context of “Posted from the TD Garden during the Celtics victory over the Clippers”, that’d be pretty great.

And it’s not really out of reach today. That tweet could have had geo-data, which would put me at the Garden during the time the game was going on. I mentioned “game”, which likely narrows the context down even further. If an app/web service could even let me go through my tweets later, tag them with context, and have that flow to my site, that would be a wonderful service.
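For what it’s worth, the matching part isn’t much code. A sketch, with a made-up events list and a crude distance check, of how a tweet’s coordinates and timestamp could be turned into that “Posted from the TD Garden during…” line:

```python
#!/usr/bin/env python
"""Sketch of after-the-fact context tagging: given a tweet's coordinates and
timestamp, match it against a hand-maintained list of places/events. The
events list and the distance threshold are invented for illustration."""
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

# (venue, lat, lon, start, end, description)
EVENTS = [
    ("TD Garden", 42.3662, -71.0621,
     datetime(2016, 2, 10, 19, 30), datetime(2016, 2, 10, 23, 0),
     "the Celtics' overtime win over the Clippers"),
]

def _km(lat1, lon1, lat2, lon2):
    # Haversine distance between two points, in kilometers
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def context_for(lat, lon, when):
    for venue, vlat, vlon, start, end, desc in EVENTS:
        if start <= when <= end and _km(lat, lon, vlat, vlon) < 0.5:
            return "Posted from %s during %s" % (venue, desc)
    return None

print(context_for(42.3663, -71.0620, datetime(2016, 2, 10, 21, 15)))
```

The interesting product question is where that events list comes from: your calendar, Foursquare-style check-ins, or just you tagging things after the fact.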

AirSonos on the Raspberry Pi

I just posted about my little Raspberry Pi server.

The other thing I’m running on it currently is AirSonos. We love our Sonos Playbar sitting beneath our TV. We use it all the time.

But it doesn’t support AirPlay, and sometimes you want to use AirPlay. I’ll get home from work listening to a podcast on Overcast. I walk in and want to play it on the Sonos while I clean or cook dinner. I can use headphones. I can turn on the TV and Apple TV, and AirPlay it to the Apple TV to listen to it through the Sonos.

Or, using the little raspi home server running AirSonos, I can now AirPlay it directly to the Sonos. It’s pretty awesome. There’s a little lag when you start it up, but once it gets going, it works swimmingly.

The little raspi is turning into a wonderful addition to our home. I find new uses for it every day (maybe this is next).

My Raspberry Pi Home Server

A month or two ago, I saw a link to Nick Farina’s awesome little node service Homebridge. Homebridge allows things in your house that don’t work with Apple’s HomeKit, say a Nest thermostat, to work with HomeKit. HomeKit enables you to do fun stuff like “Siri, set the temperature downstairs to 66 degrees.”

You know, really important stuff.

I’ve been trying to rein in our power use. We have laptops and iPads and iPhones and a couple of TVs and a Wii U and Xbox and DVR, etc. That’s a lot of juice. I’ve replaced all (well, most) of our lights with LEDs. I’ve played with power settings and anything else I can find to try to reduce our overall power usage.

The last thing I needed was to run my iMac full time as a home server.

Enter the CanaKit Raspberry Pi.

I’d been looking to muck around with a Raspberry Pi (from here on out, a raspi) for a bit, but never had a good reason to. Here’s a perfect use: a super low power, tiny, always on home server.

I got it last week and spent a few hours getting it configured. Then I set up Homebridge.

(After mucking with my network and nearly breaking everything…) It all worked.

Homebridge has a bunch of plugins. Our Nest thermostats were added, but I also added our Sonos. And, eventually, I’ll add other devices (I have a Twine in our basement keeping an eye on the temperature – I may work out a way to scrape its data and push it to Homebridge).
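I haven’t worked out what that Twine scrape would actually look like, but the general shape is simple: poll whatever endpoint exposes the reading and drop it somewhere a Homebridge plugin can pick it up. Everything in this sketch — the URL, the response format, the output path — is a placeholder, not a real Twine API.

```python
#!/usr/bin/env python
"""Purely hypothetical sketch: poll some endpoint that exposes the Twine's
temperature reading and write it to a file a Homebridge plugin could watch.
The URL, the response format, and the output path are all made up."""
import json
import time

import requests  # pip install requests

TWINE_URL = "http://twine.example.com/api/reading"   # placeholder endpoint
OUT_PATH = "/home/pi/homebridge/basement-temp.json"  # placeholder output file

while True:
    reading = requests.get(TWINE_URL).json()  # e.g. {"temperature_f": 58.3}
    with open(OUT_PATH, "w") as f:
        json.dump({"temperature_f": reading["temperature_f"]}, f)
    time.sleep(300)  # poll every five minutes
```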

It’s not the greatest thing in the world, but there’s something nice about being able to tell Siri to turn the temperature down. If I get a smart plug, I could tell Siri that I’m going to bed, and have it turn off the living room lamp, turn the temp down, and (with a little bit of work), maybe even have it turn off the TV.

That’d be pretty cool. And it all runs off a server the size of a couple of packs of cards that makes no noise and probably costs < $10/year to run.