If You Aren’t Already Using a VPN, Time to Start

I mean, everybody wants to make sure their ISPs can sell their data, right?

I was particularly saddened to see Rep. Massie on the list of those voting for this measure. Having worked for him (years ago), he is certainly smart enough to understand the technical implications here, but voted out of the idea that the free market was already doing a good enough job of this (i.e. Comcast won’t sell your data without your permission, for fear that you’ll leave for a competitor).

The problem is that, in great portions of this country, there’s no free market for ISPs. In most locations, it’s a local monopoly. I’m lucky: in my city, we have two cable providers, plus high speed fiber (fios). In the town I grew up in? One cable provider. And then DSL, if you live in the right spot. The house I grew up in? No DSL. No options.

Anyway, use a VPN. Most sites are using HTTPS these days, which is helpful, but your ISP will still know what name you looked up, what IP came back, and how long you were on the site. If you want to be careful, switch to an open DNS provider, and use a VPN. Most DNS providers will also use your data, but they will at least give you the option to opt out. (As backwards as this sounds, I'd recommend Google Public DNS.)[1]
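To see why the name still leaks even with HTTPS: a DNS query goes out before any encryption starts, and the hostname travels in plain, readable bytes. Here's a minimal sketch of a standard DNS A-record query (the RFC 1035 wire format), just to illustrate the point:

```python
import struct

def build_dns_query(hostname: str, query_id: int = 0x1234) -> bytes:
    """Build a minimal DNS A-record query packet (RFC 1035 wire format)."""
    # Header: id, flags (recursion desired), 1 question, 0 other records
    header = struct.pack(">HHHHHH", query_id, 0x0100, 1, 0, 0, 0)
    # Question name: each dot-separated label is length-prefixed,
    # then a zero byte terminates the name
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    qtype_qclass = struct.pack(">HH", 1, 1)  # type A, class IN
    return header + qname + qtype_qclass

packet = build_dns_query("example.com")
# The name you looked up is right there in the clear --
# anyone between you and the resolver (i.e., your ISP) can log it.
assert b"example" in packet and b"com" in packet
```

You'd normally ship that packet over UDP port 53 to your resolver (e.g. 8.8.8.8), completely unencrypted. That's the leak.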

For VPN, both Cloak and TunnelBear are reasonably cheap (probably less than you pay for 1 month of internet) and easy. Or, if you’re so inclined, roll your own.


  1. Google’s DNS privacy is pretty clear—“We don’t correlate or combine information from our temporary or permanent logs with any personal information that you have provided Google for other services.”  ↩

Was Sun One of the Powers That Created Captain Planet?

It took about 4 months of back and forth and permitting. Two and a half days of actual work on the roof. A couple of visits from a friendly inspector to make sure everything was kosher. And, finally, a 30 minute visit from a nice tech to set up the wifi.

In the end, we’ve got an array of 26 solar panels producing energy on our roof (and set up in a location that you don’t really see from the street).


Unfortunately, we’ve only had a couple of sunny days since then, but on a cold, but sunny, day in March, they produced about 40 kWh of power, which I think is more than what we’d use on a normal day. It’ll be interesting to see how we do in April and May. I’m optimistic this will have really nice returns for us.

So far the only real issue has been the monitoring software, Enlighten from Enphase. When it’s working, it’s really nice. But, while my end is reporting pretty regularly, the website seems to go long stretches between updates. And, over the weekend, it seemed flat-out down. I’m hoping I can figure out a way to pull info from it directly. It looks like there’s an API, so I might be able to wire up a Homebridge plugin to pull data from it and then show usage in my HomeKit apps.
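As a rough sketch of what that Homebridge plugin's data-fetching side might look like, here's some Python against what I believe Enlighten's v2 REST API looks like. The endpoint path, parameter names, and the `current_power` field are assumptions to verify against Enphase's developer docs, and the key/user/system IDs are placeholders:

```python
import json
import urllib.parse
import urllib.request

# NOTE: endpoint path and parameter names are assumptions based on Enphase's
# published v2 API; check your developer account for the current values.
API_BASE = "https://api.enphaseenergy.com/api/v2"

def summary_url(system_id: str, api_key: str, user_id: str) -> str:
    """Build the URL for a system's production summary."""
    params = urllib.parse.urlencode({"key": api_key, "user_id": user_id})
    return f"{API_BASE}/systems/{system_id}/summary?{params}"

def watts_now(summary_json: str) -> int:
    """Pull the current production (in watts) out of a summary response."""
    return json.loads(summary_json)["current_power"]

# Usage (SYSTEM_ID, API_KEY, USER_ID are placeholders for your own values):
# resp = urllib.request.urlopen(summary_url(SYSTEM_ID, API_KEY, USER_ID))
# print(watts_now(resp.read().decode()))
```

From there, a Homebridge plugin would just poll that on an interval and expose the number to HomeKit.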

(And, no, Sun wasn’t one of Captain Planet’s people. Earth, Fire, Wind, Water, Heart. I guess maybe Fire counts?)

This is Why We Can’t Have Nice (Free) Things

There was a little internet kerfuffle last week when Matt Mullenweg from WordPress correctly pointed out that Wix was violating the GPL. Now, he did it in maybe not the nicest way (“If I were being honest, I’d say that Wix copied WordPress without attribution, credit, or following the license”), but at its core, his argument was true.

A core part of Wix’s mobile editor is forked from WordPress’ GPL licensed editor library.

And that’s pretty much all there is to it. In the end, if you use something that is GPL’d in your application, you walk a fine line: you may need to open source your application and offer your source code under the GPL as well. The GPL is a viral license (GPLv3 particularly so), and including code licensed under it is, at best, something you should do with a close reading of the license. At worst, you simply shouldn’t include any GPL code.

Wix’s CEO posted a response and completely missed the point. As did one of their engineers. They both seem to think that intent matters. While it does matter in that it helps us understand that there was probably not any malicious intent, the GPL is the GPL and it’s pretty clear.

As Daniel Jalkut says:

if you want to link your software with GPL code, you must also make your software’s source code available … You have to give your code away. That’s the price of GPL.

Many developers understand, and view the price of GPL as perfectly justified, while others (myself included) find it unacceptable. So what am I supposed to do? Not use any GPL source code at all in any of my proprietary products? Exactly. Because the price of GPL is too much for me, and I don’t steal source code.

In my office, we’ve basically made the same rule. Even though we don’t ship code, we still stay away from GPL’d code as much as possible, simply to avoid any chance of impropriety.

I look at the GPL like Dave Matthews Band. It sucks, there’s lots of other licenses just like it that are much, much better, and its fans are so annoying as to make it that much worse.

The Olympics and Blaming Millennials

(Note: I’m not sure why these two articles bugged me so much. But they did.)

There was a somewhat poorly written (or, at least, poorly titled) article on Bloomberg (shocker) about the down ratings for the Olympics on NBC. In typical Bloomberg fashion, it’s a clickbait title (who can resist blaming millennials?), as the article itself points out that it’s the 18–49 demographic that saw ratings down (with no breakdown inside that demo to determine where the real drop was).

And in the 18-to-49-year-old age group coveted by advertisers, it’s been even worse. That audience has been 25 percent smaller, according to Bloomberg Intelligence.

In response, a millennial (presumably) attempted to defend his peers and lash out at NBC (though, really, it was more about the cable industry) and the inability of the cable/broadcast industry to meet the needs of cord cutters.

The issue I have with the article isn’t so much the argument. I agree that the cable and TV industries are going to have to change the way they think about the broadcast model. And, while it may not be changing as fast as we’d want, it’s changing incredibly fast!

Think about it: ten years ago, being a cord cutter meant using an antenna, borrowing DVDs from the library, and maybe downloading a show from iTunes.

Today, you could conceivably use Hulu, Netflix, HBO, Sling TV, and iTunes, and probably cover everything except live sports. And ESPN may be going over the top in the next year. That’s progress.

The article, however, takes almost 1800 words to complain about how difficult it was to watch the Olympics online without a cable subscription and then complains about too many advertisements and the lowest-common-denominator announcing during the opening ceremonies, as if these are new things. And, while cord cutters are a growing audience, something like 80% of households still have cable. Cord cutters alone didn’t cause the audience to drop.

No, it’s not until the last segment of the article, which mostly hits on what I believe to be part of the real reason for the ratings being down:

It opened with Simone Biles and Co., but then, despite being filmed earlier in the day, inexplicably goes from the earlier rounds of Gymnastics to Swimming. Hours pass before we finally get to see the resolution to those Gymnastics rounds

The ratings were down because NBC couldn’t figure out how to show events in real time to both the East and West Coasts. With clips showing up online, on ESPN, on Twitter, the audience, millennials or not, couldn’t be bothered to stay up until 11:30pm ET to watch the gymnastics finals that had already happened that day. Or worse, for the half of the country on the West Coast, that had happened many hours before.

NBC’s real crime is not figuring out how to get more of the core events in front of the audience while they were live. Live sports are the only thing left that really can keep audiences from cutting the cord, and NBC (while well intentioned with their wall-to-wall online coverage) forgot that.

In the end, Bloomberg incorrectly blamed millennials, and, in turn, millennials (or this millennial) responded in the stereotypically myopic millennial way.

ScanSnap Directly to the Cloud

Last week, Fujitsu added an awesome feature to their ScanSnap scanner line (at least, the iX500 that I have). You can set it up so that, rather than needing a machine on the same wireless network to pick up the scanned documents, the scanned documents just get shipped to your Dropbox or Google Drive.

That lets you do some really interesting things. You can run Hazel rules on your Dropbox folder, just like you can on a local folder, to do automatic sorting, naming, etc. on your machine. You can also do some interesting automation things with IFTTT to trigger other types of activity based off of files getting scanned. Or some combination of both (you scan some sort of receipt, it’s automatically filed into a folder via Hazel, which also triggers an IFTTT action to send an email to someone telling them that receipt is there).
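If you'd rather not use Hazel (or want the sorting to run somewhere other than your Mac), the same kind of filename-based rule is easy to sketch in Python. The folder layout and keywords here are hypothetical, just to show the shape of it:

```python
import shutil
from pathlib import Path

# Hypothetical paths and keywords -- adjust to your own Dropbox layout.
SCAN_INBOX = Path.home() / "Dropbox" / "ScanSnap"
RULES = {
    "receipt": Path.home() / "Dropbox" / "Filing" / "Receipts",
    "invoice": Path.home() / "Dropbox" / "Filing" / "Invoices",
}

def sort_scans(inbox: Path = SCAN_INBOX, rules: dict = RULES) -> list:
    """Move each scanned PDF whose name contains a keyword into its folder."""
    moved = []
    for pdf in inbox.glob("*.pdf"):
        for keyword, dest in rules.items():
            if keyword in pdf.name.lower():
                dest.mkdir(parents=True, exist_ok=True)
                shutil.move(str(pdf), str(dest / pdf.name))
                moved.append(pdf.name)
                break  # first matching rule wins, like Hazel
    return moved
```

Run it on a schedule (cron, launchd) against the synced folder and you get the Hazel half of the workflow anywhere Python runs.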

The cloud feature seems small, but it’s a huge improvement to the convenience of what is already a device that has made my life a lot simpler.

Let’s Encrypt SSLs

A couple of months back, I went through the process of trying out Let’s Encrypt to set up some SSL certs for my various little sites. Do my sites really need encryption? No. But, at this point, it’s easy enough to set up an SSL cert, and I’d rather my sites pass their data securely, even if no one cares what goes on between my site and your browser. I’m not storing credit cards or capturing info about my visitors (beyond the analytics Google captures), but in a world where the government is increasingly looking for ways to get at the data of citizens, why not do it?

Plus, it’s free.

It’s a little bit of a challenge to get set up if you’re not already used to mucking around with server management. The newer versions (as of this moment, 0.5.0) make things much easier, but you’re still going to need to be at least familiar with git, python, and sudo.

Once you’ve gotten certs and gotten your servers configured, you just need to remember that these certs expire every 3 months, unlike yearly (or longer) for more traditional certs. Currently, you’re on your own to renew them, but it sounds like they’ll be building out renewal scripts to make it easy.
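Until those renewal scripts show up, a small script can at least warn you before a cert lapses. Here's a sketch using Python's standard library to check how many days a site's cert has left; the 30-day threshold is just a comfortable buffer I picked, not anything Let's Encrypt mandates:

```python
import socket
import ssl
import time

RENEW_THRESHOLD_DAYS = 30  # arbitrary buffer before the 90-day expiry

def days_remaining(not_after: str, now=None) -> float:
    """Days until a cert's notAfter timestamp, e.g. 'Jun  1 12:00:00 2017 GMT'."""
    expires = ssl.cert_time_to_seconds(not_after)
    return (expires - (now if now is not None else time.time())) / 86400

def check_site(hostname: str, port: int = 443) -> float:
    """Connect to a site over TLS and report how many days its cert has left."""
    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((hostname, port)),
                         server_hostname=hostname) as sock:
        cert = sock.getpeercert()
    return days_remaining(cert["notAfter"])

# Usage: run from cron and nag yourself when it's time to renew.
# if check_site("example.com") < RENEW_THRESHOLD_DAYS:
#     print("time to renew!")
```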

SSL certs are already reasonably inexpensive (providers like Comodo often sell them for less than the cost of your annual domain renewal), but the ability to get certs for any number of subdomains for free is pretty compelling. Once the automation is in place, there’ll be almost no reason to run a server without https.

(Of course, Let’s Encrypt could be a big government ploy to get everyone to install free certs that they have the key to, and they’ll be able to eavesdrop on all of us with ease.)

Microblogging

Manton Reece has been working on an app/business/service that I think is really in the “own your own Twitter” space. Basically, why not own your own work, rather than just pushing it into Twitter.

It’s something I’ve thought about in the past. If I could post to Twitter and push those to my blog at the same time, it’d give me a full accounting for most of what I do online (suck in Instagram, and you probably get the totality of it).

I’m interested to see what he comes up with. I think, often, that my Tweets only make sense in the context of the moment. A Celtics game or a concert, or whatever is happening on TV. Some are of the moment in a world sense, and make more sense standing alone.

For example,

Serial is pretty popular, so that stands up on its own alright (and, for fun, go search Bergdahl and Rand on Twitter. It’s amazing.)

This tweet, however,

only makes sense when you realize I was at the Celtics/Clippers game before the All-Star break, which the Cs pulled out in overtime.

If you push your tweets/microposts to your blog, even if it’s within the context of your other tweets/posts, can you maintain that context of the moment? I’m not sure.

It’d be amazing if, whether via an app or later inside of your blogging application, you could add that context. If I could post from an app that knows my location, can determine I’m at the Celtics game, and can add enough metadata to that tweet to put it in the context of “Posted from the TD Garden during the Celtics victory over the Clippers”, that’d be pretty amazing.

And it’s not really out of reach today. That tweet could have had geo-data, which would put me at the Garden, during the time the game was going on. I mentioned “game”, which likely narrows the context down even further. If an app/web service could even let me go through my tweets later, tag them with context, and have that flow to my site, that would be a pretty amazingly wonderful service.
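As a sketch of how that matching might work: given a tweet's coordinates and timestamp, check them against a list of venues and event windows. The venue list, event schedule, match radius, and output phrasing are all hypothetical here; a real service would pull venues from a places API and events from a schedule feed:

```python
import math
from datetime import datetime

# Hypothetical venue database (name -> lat, lon).
VENUES = {"TD Garden": (42.3662, -71.0621)}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def tag_context(lat, lon, when, events):
    """Match a post's geo-tag and timestamp against known events.

    events: list of (venue_name, start, end, description) tuples.
    Returns a context string, or None if nothing matches."""
    for venue, start, end, description in events:
        vlat, vlon = VENUES[venue]
        # Within the event's time window and ~500m of the venue.
        if start <= when <= end and haversine_km(lat, lon, vlat, vlon) < 0.5:
            return f"Posted from {venue} during {description}"
    return None
```

With geo-data on the tweet and a schedule feed, that Celtics tweet tags itself.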