How to pay for the Internet, part 0xDEAF0001

Today’s Wall Street Journal had an article about Facebook, in which they promise to change the way they serve advertising in order to defeat ad blockers. This quote, from an FB spokesperson, was choice:

“Facebook is ad-supported. Ads are a part of the Facebook experience; they’re not a tack on”

I’ll admit, I use an ad blocker a lot of the time. It’s not that I’m totally anti-ads, but I am definitely against utter trash: garbage, useless ads that suck up compute and network resources, cause pages to load much more slowly, and, often enough, include malware and tracking. The problem is most acute on mobile devices, where bandwidth, CPU power, and pixels are all in short supply, and yet it’s harder to block ads there. In fact, you really can’t do it without rooting your phone or doing all your browsing through a proxy.

The ad-supported Internet is just The Worst. I know, I know, I’ve had plenty of people explain to me that that ship has sailed, but I can still hate our ad-supported present and future.

  • Today’s ads suck, and they seem to be getting worse. Based on trends in per-ad revenue, it appears that most of the world agrees: ads are less and less valuable.
  • Ads create perverse incentives for content creators. Their customer is the advertising client, and the reader is the product. In a pay for service model, you are the customer.
  • Ads are an attack vector for malware.
  • Ads use resources on your computer. Sure, they pay the content provider, but the CPU cycles on your computer are stolen.

I’m sure I could come up with 50 sucky things about Internet advertising, but I think it’s overdetermined. What is good about it is that it provides a way for content generators to make money, and so far, nothing else has worked.

The sad situation is that people do not want to pay for the Internet. We shell out $50 or more each month for access to the Internet, but nobody wants to pay for the Internet itself. Why not? The corrosive effect of an ad-driven Internet is so ubiquitous that people cannot even see it anymore. Because we don’t “pay” for anything on the Internet, everything loses its value. Journalism? Gone. Music? I have 30k songs (29.5k about which I do not care one whit) on my iThing.

Here is a prescription for a better Internet:

  1. Paywall every goddam thing
  2. Create non-profit syndicates that exist to attract member websites and collect subscription revenue on their behalf, distributing it according to clicks, or views, or whatever, at minimal cost.
  3. Kneecap all the rentier Internet businesses like Google and Facebook. They’re not very innovative and there is no justification for their outsized profits and “revenue requirements.” There is a solid case for economic regulation of Internet businesses with strong network effects. Do it.
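Point 2 describes a simple mechanism: pool subscription dollars, take a minimal cut for overhead, and split the rest among member sites pro rata by traffic. A minimal sketch of that split (the function, site names, and numbers are all hypothetical, purely for illustration):

```python
# Hypothetical sketch of a syndicate payout: pool subscription revenue,
# deduct a small overhead, and distribute the remainder to member sites
# in proportion to their views (or clicks, or whatever metric is chosen).

def distribute(pool_dollars, views_by_site, overhead_rate=0.02):
    """Split pooled revenue pro rata by views, after minimal overhead."""
    payable = pool_dollars * (1 - overhead_rate)
    total_views = sum(views_by_site.values())
    return {site: payable * views / total_views
            for site, views in views_by_site.items()}

# Illustrative numbers only: $100k pool, three member sites.
payouts = distribute(100_000, {"siteA": 700_000,
                               "siteB": 200_000,
                               "siteC": 100_000})
```

The key property is that the syndicate itself is a thin pass-through: the overhead rate is the only knob, and everything else is determined by where readers actually spend their attention.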

I know this post is haphazard and touches on a bunch of unrelated ideas. If there is one idea I’d like to convey is: let’s get over our addiction to free stuff. It ain’t free.


The future of electrical engineering as a profession

The other day I was watching Dave Jones, a video blogger that I find entertaining and informative. His blog, the EEVblog, is catnip for nerds who like to solder stuff and use oscilloscopes.

Recently he did a short segment in which he answered a question from a student who was upset that his professor had told him that EE was perhaps not a great field for job security, and Dave sort of went on a colorful rant about how wrong the professor is.

The professor is right.

Electrical engineering employment is indeed in decline, at least in the USA and, I suspect, other developed countries. It’s not that EE skills are not helpful, or that understanding electronics, systems, signals, etc., is not useful. They are all useful and will continue to be. But I think more and more of the work, in particular the high-paying work, will migrate to software people who understand the hardware “well enough.” Which is fine. The fact is that EEs make good firmware engineers.

I think someone smart, with a solid EE background and a willingness to adapt throughout their entire career, should always find employment, but over time I suspect it will be less and less directly related to EE.

I mostly know Silicon Valley. Semiconductor employment is way down here. Mostly, it is through attrition, as people retire and move on, but nobody is hiring loads of young engineers to design chips anymore. It makes sense. Though chip volumes continue to grow, margins continue to shrink, and new chip design starts are way down, because “big” SOCs (systems on chip) with lots of peripherals can fill many niches that used to require custom or semi-custom parts.

I suspect that the need for EEs in circuit board design is also in decline. Not because there are fewer circuit boards, but because designing them is getting easier. One driver is the proliferation of very capable semiconductor parts with lots of cool peripherals, which is obviating a lot of would-have-been design work. It’s gotten really easy to plop down a uC and hook up a few things over serial links and a few standard interfaces. In essence, a lot of board design work has been slurped into the chips, where one team designs it once rather than every board designer doing it again. There might be more boards being designed than ever, but the effort per board seems to be going down fast, and that’s actually not great for employment. Like you, I take apart a lot of stuff, and lately I’m blown away not by how complex many modern high-volume boards are, but by how dead simple they are.

The growth of the “maker” movement bears this out. Amateurs, many with little or no electronics knowledge, are designing circuit boards that do useful things, and they work. Are they making mistakes? Sure, they are. The boards are often not pretty, and violate rules and guidelines that any EE would know, but somehow they crank out working stuff anyway.

I do hold out some hope that as Moore’s law sunsets — and it really is sunsetting this time — there will be renewed interest in creative EE design, as natural evolution in performance and capacity won’t solve problems “automatically.” That will perhaps mean more novel architectures, use of FPGAs, close HW/SW codesign, etc.

Some statistics bear all this out. The US Bureau of Labor Statistics has this to say about the 2014-2024 job outlook for EEs:
http://www.bls.gov/ooh/architecture-and-engineering/electrical-and-electronics-engineers.htm#tab-6

Note that over a 10-year period they are predicting essentially no growth for EEs at all. None. Compare this to employment overall, for which they predict 7% growth.

One final note. People who love EE tend to think of EEs as the “model EE” — someone clever, curious, and energetic, who remains that way for 40+ years. But let’s remind ourselves that half of EEs are below the median. If you know the student in question, you can make an informed assessment of that person’s prospects, but when you are answering a generic question about prospects for generic EEs, I think the right picture to have in mind is that of the middling engineer, not a particularly good one.

I’m not saying at all that EE is a bad career. For all I know, the number of people getting EE degrees is going down faster than employment, so that the prospects for an EE graduate are actually quite good. But it is important for students to know the state of affairs.

Narrow fact-checking is less than useless

Last night, Donald Trump gave a speech that included a bunch of statements about crime that the New York Times fact-checked for us. This summarizes what they found:

Many of Mr. Trump’s facts appear to be true, though the Republican presidential nominee sometimes failed to offer the entire story, or provide all of the context that might help to explain his numbers.

Putting aside the ridiculously low bar of “many facts appear to be true”, they failed to mention or explain that in every case, despite his factoids being narrowly true, the conclusions he was drawing from them, and suggesting we draw from them, were absolutely, incontrovertibly false.

This kind of reporting drives me bonkers. Crime stats, like all stats, are noisy, and from one year to another, in a specific city, you can find an increase or decrease — whatever you are looking for. But the overall trends are clear, and Trump’s assessment was utter bullshit.

Another, somewhat less savory, media outlet did a much better job, because they took the 10 minutes of Googling necessary to assemble some charts and put Trump’s facts in context.

Would it have been partisan for the NYT to put Trump’s facts into context with respect to the conclusions he was drawing from them? It just seems like journalism.


Gah… Apple

I use a Mac at work. It’s a fine machine and I like the screen and battery life, but I’m not generally a fan of Apple the company or its products. Sometimes I forget why, and I need to be reminded.

Like today, when I decided that, even though Safari is basically a sucky product, there are probably people who use it, so I might as well port my little political statement Chrome extension to Safari. I’d already ported it to Firefox, so how hard could it be?

Well, it turns out, not too hard. Actually, for the minimalist version that most people are using, it required no code changes at all. It did take me a while to figure out how everything works in the Apple extension tool, but overall, not too bad.

I knew I would have to submit it to reviewers at Apple to get it published; I had to do the same at Mozilla for Firefox. But what I did not know is that in order to do that, I had to sign up to be an Apple Developer. Moreover, I could only do so under my real name (i.e., not dave@toolsofourtools.org), and, most annoying, they wanted $99. A year. For as long as the extension is up.

I’m not going to pay $99/yr to provide a free plugin for the few people who are dumb enough to use Safari on a regular basis.

In an odd way, this gets right to the heart of one of the many reasons I do not like Apple. They are constitutionally opposed to my favorite aspect of computing and the Internet: the highly empowering ability for people to scrappily do, say, or make anything they want for next to nothing, at whatever level of sophistication they want to deal with. Apple doesn’t like scrappy things in its world, and actively weeds them out.

Apple, you suck. Thanks for the reminder never to spend my own money on your polished crap.

Simulate this, my dark overlords!

Apparently, both Elon Musk and Neil deGrasse Tyson believe that we are probably living in a more advanced civilization’s computer simulation.

Now, I’m no philosopher, so I can’t weigh in on whether I really exist, but it does occur to me that if this is a computer simulation, it sucks. First, we have cruelty, famine, war, natural disasters, disease. On top of that, we do not have flying cars, or flying people, or teleportation for that matter.

Seriously, whoever is running this advanced civilization simulation must be into some really dark shit.