jb… a weblog by Jonathan Buys

Nostalgic Development

Like many my age, my first introduction to writing code was creating basic web pages, mimicking what I could find by right-clicking on a site and selecting “view source”. HTML was, and continues to be, simple. There are nested elements inside the top and bottom tags, and the style sheet defines how those elements are presented. But somewhere along the line we’ve collectively lost our way.

For example, I recently worked on a rather large Python web app. The basic concept of a web app is fine: it dynamically creates the HTML on the backend and handles the input from the page. A layer on top of the HTML, but a necessary one for developing anything dynamic. The Python environment has its own package manager, and bundling things up is fairly simple. Then the developers decided to do some modernization of the UI, which required significant modifications to the build pipeline.

Instead of a pure Python environment, we now needed Node.js. We aren’t running a Node server; we only need it for the build process. Not to build the actual application, mind you, just the CSS and JavaScript. Node famously comes with its own package manager, npm, and thank goodness, because our site suddenly needs 899 packages in the node_modules directory. Building on top of Node we’ve got React and webpack. Webpack is a bundler that processes JavaScript and Sass files, compiling them into JavaScript and CSS suitable for deployment. Why do we need Sass? I have no idea. I also don’t know why we need to compile our JavaScript down into bundled JavaScript.
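To make the layering concrete, a build like the one described boils down to something like this from the command line (a hypothetical sketch, not our actual pipeline; it assumes a standard requirements.txt and package.json):

# Install the Python app's own dependencies
pip install -r requirements.txt

# Install the hundreds of node_modules packages, needed only to build assets
npm install

# webpack compiles the Sass and JavaScript into bundles fit for deployment
npx webpack --mode production

Two package managers and a compiler pass, just to ship CSS and JavaScript.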

We’ve taken what was simple and beautiful and piled on so much clutter and junk that it’s nearly unrecognizable from the days of “view source”. As in all things, I’m sure there’s a lot about this situation that I don’t understand. I’m sure that the developers of these projects have good intentions, and see a definite need for their work. It’s just that I don’t see it. I don’t understand why we need these layers of abstraction.

I’ve been creating web pages for 20 years, in one form or another. I really thought that HTML5 would be a renaissance of simple, usable web development, but for the most part, that hasn’t happened. Well, at least we finally got rid of Flash.

Think About the Future

Over the past twenty years the tech industry has greased the tracks of an express train to dystopia. As age creeps up on me and my hair continues to grey, I think back on the naive optimism of my youth with increasing nostalgia. We live in a world of constant surveillance, persistent erosion of privacy, a decline of democracy, and a rise of populist demagogues. Every new year becomes the hottest year on record, America has an obesity epidemic, and starvation is still a problem around the globe. The Amazon is on fire, opiate abuse is rampant, our kids are suffering from mental health problems, and everyone is too distracted by their phones to care. In short, we’ve made a mess of things.

Being a small part of this industry I can’t help but feel some responsibility. Although I’ve always been a small cog in a massive machine, I’ve been a cog with choices, and those choices did not always turn out as hoped. The easy and human response to our situation is cynicism and scapegoating, blaming the other without accepting any of the responsibility ourselves. I find this kind of laziness unacceptable, an abdication of character and integrity. It’s giving up. We can never give up.

Instead, I again choose optimism. Not naive optimism, but one born of experience and faith. I think most people are good, and clever, and when given the chance want to do their part. I think to share this optimism we first need a vision for the future. Not an apocalyptic future, but one where we’ve solved or are in the process of solving our current problems. Less Mad Max and more Star Trek. Or, somewhat more realistically, more of Microsoft’s most recent Future Vision video. Technology, humanity’s expression of boundless problem-solving ability, must be the underlying foundation for what comes next.

One thing we must agree on before we can move forward is that we can’t go back. We can’t time travel, we can’t bring back “the good ol days”, and we can’t change our culture to recreate an imagined point in the past when things were better. The genie is out of the bottle; we have no choice but to move forward. As uncomfortable as that might make us, there really is no other choice, and anyone who tells you otherwise most likely wants something from you. Smartphones, tablets, social media, the Internet… they are all here to stay. What must change is how we use them. Technology is a tool and a mirror; how we use it shows us who we are.

To solve big problems we must be able to think clearly and concentrate. Luckily we’ve got smart people working on this problem, like Cal Newport and Shawn Blanc. I submit that we need a societal shift towards a mentality that treats social media similarly to alcohol: perfectly acceptable in moderation, enjoyable with friends, but improper at work or school. Or maybe a British attitude is more appropriate: go ahead and have a pint of Twitter at lunch, then go back into your Eudaimonia Machine at work.

This ability to think clearly, without distraction or interruption, must also extend into our school system. We have adopted one-to-one programs across the country that give each child a laptop, and then expect them to have the self-control to use that machine to study, take tests, and do homework. Most of the machines we’ve given them aren’t built for that by default; they are multi-tasking environments that make it quick and easy to switch between tasks, an accident waiting to happen for an already distracted mind. Once again, we’ve adopted a technology without fully understanding its implications. Technology in education is a broad and deep topic that I hope to cover in more detail in the future. For now, I’ll summarize my position by saying that I advocate for devices like the reMarkable e-ink tablet. Not less technology, but tech better suited to the task at hand. Technology that respects our humanity, with all its faults and vulnerabilities.

Once we can think clearly it will be much easier to spot partisan propaganda and “fake news”. Without the talking heads on TV, podcasts, YouTube, and Twitter drowning out intelligent conversation we can start to have meaningful debates about things that really matter. As a society, we must inoculate ourselves against psychological warfare like Breitbart and Twitter trolls. We need to be able to identify attempts to promote the false and hateful ideology that seeks to divide us and reject it. The world is awash in mammoth-sized problems; it’s going to take all of us working together to solve them. We must be able to concentrate, then find common ground, and out of that a path forward.

And what is that path forward? What vision should we share? What do we want in the future? Clean air and water. Safe cities, thriving communities. An economy that supports small towns and big cities alike. Work that is respected whether you do it with your hands or your mind. Individuals with the freedom to live as they choose, and the responsibility to themselves, their family, and their community that comes with that freedom. The ability to produce and distribute enough food and fresh water that no one goes hungry or is forced to drink bad water. These problems are hard, but not impossible.

I can see a future where our differences are sorted out through vigorous debate. Where our technology is powered by clean, renewable energy. Where we’ve abandoned our dependence on the fossil fuels and plastics that are destroying our environment. Where our food, clothing, and other consumables are sustainable. This is not a utopia; I don’t envision a world without crime or war, but I do envision one with much, much less hate and violence than we currently have. We can turn the tide of the mental health crisis we are currently experiencing. We can defeat the hopelessness and depression that turns people to drugs. We can build technology that prioritizes individual physical and mental health, as well as privacy, security, and autonomy.

We just need to decide to do it. Let’s talk about how.

The September 2019 Apple Event

Several more professional sites have written longer and better articles about Apple’s recent event than I can do here. A few of my favorites, in no particular order: John Gruber’s take; Ryan Christoffel and Alex Guyot covering the new iPhones and Apple Watch, respectively, at MacStories; Jason Snell’s take on hits and misses at Six Colors; and, always worth a click, everyone’s pal Jim Dalrymple’s thoughts on the event at The Loop. And of course, the team at iMore has an entire section set aside for the many articles they’ve already written about what’s new.

This is not a review, just my thoughts on the new products after letting the dust settle for a couple days.

Apple Watch Series 5

I can see myself upgrading my Series 3 to the 5. The bigger screen that debuted in the Series 4 is attractive enough, but the always-on screen in the 5 really pushes it over the edge. This is the one Apple device I use every day, all day; I’ve worn my Watch nearly every day for almost two years straight.

iPad 7

I have many conflicting thoughts about the modern computer for the rest of us. Setting those aside for the moment, this looks like a great update to the entry-level iPad. Larger screen and finally a proper keyboard option, but the same A10 chipset. For $330 this is the right option for someone looking for a casual computing device to take notes, watch video, send and receive emails, and surf the web.

iPhone 11

When compared to the XR, the iPhone 11 is an incremental update with a slightly faster CPU, slightly better battery life, a big update to the camera, and worse color options overall. I’m not a fan of the washed-out pastels, especially when compared to the vibrant and fun colors of the XR. The yellow is especially egregious.

That being said, it’s important to note that this is how Apple rolls: one small incremental update after another, until after a few of them the iPhone 11 is a massive update in all aspects from something like an older iPhone 6S. Color preferences are just that, preferences. That this year’s don’t match mine doesn’t make them bad, just not for me. What we can’t ignore is that this year Apple ships yet another incrementally better iPhone, one that’s better than previous versions in all the ways that matter.

iPhone 11 Pro Max Super Duper Cool XDR Edition

Apple really can’t name anything anymore.

Better battery, better camera, better screen, but not better enough to justify the additional $300 the Pro costs. And the colors for the Pro are just awful: that Midnight Green looks like a sad color for a car in East Berlin before the wall fell. The gold is more of a copper, and the white is more of a cream. Space Grey remains the best option for an iPhone that lacks the color options of the 11.

Of all the Apple devices that are “not for me”, the Pro Max is the not for me’ist. I’m actually looking for ways to use my phone less, not more, and I’d rather have a smaller SE-sized phone than even the larger size that originated with the iPhone 6, let alone the Max.

Apple Arcade

$5 per month? Sold. I’m always looking for new games, and I know my kids will get a kick out of this too, especially once it’s available on the Apple TV. I’m even considering getting an Xbox One controller to turn the Apple TV into an almost real gaming console.

Also, I take back what I said previously. Apple Arcade is a great name for the service.

Apple TV+

$5 per month? Sold. I have high hopes for the shows they’ve advertised so far, and I think that over time the TV+ catalog will grow to a respectable size. My current plan is to drop Netflix, pick up TV+, and upgrade to the Hulu and Disney+ bundle. And maybe, someday, drop cable once more.

Miscellany

As discussed on the most recent ATP, the game demos were not good. I also thought they had too many videos, and I miss Jony’s British voiceovers. And I 100% agree with Marco that the forced applause from Apple Retail employees is really starting to feel fake and cringe-worthy. This video for the Watch was Apple at their best.

Enterprise Software Again

I realized today that it’s been ten years since I dedicated an entire post to complaining about enterprise software. In those ten years not much has changed, unfortunately. Enterprise software is still crap, and it’s still more of a hassle than it’s worth. It’s best avoided whenever possible, so when you find yourself evaluating software or services for your company, here are a few easy markers to identify the products you should let pass by.

  1. Enterprise software doesn’t want to tell you how much it costs.
  2. Enterprise software often doesn’t even list what it does; instead it wants to partner with you to provide “solutions”.
  3. Enterprise software doesn’t provide you technical documentation until after you’ve paid. And even then, it’s lacking.
  4. Instead of real documentation, the marketing departments of enterprise software vendors will write “whitepapers”, which are entirely useless.
  5. The user-facing part of enterprise software is almost always complete garbage.

This last point is important because it gets to the crux of what enterprise software is: software wherein the person who pays for it is not the person who uses it. Payment for these solutions is handled by managers who are several steps removed from the daily process of having to put the software in place and use it as intended. What the managers need is a way to justify the exorbitant fees enterprise software vendors charge, so the vendors’ sites are full of marketing jargon and various scenarios, hoping to inspire one manager to convince another manager that the price is worth it.

It’s not.

There’s almost always a better way to go about solving whatever problem an organization seeks out a vendor to solve. My personal preference is to solve it in house with open source software and custom development. That way, the money you would have spent on the garbage solution from an enterprise software vendor is spent investing in your own organization. Invest in yourself, solve your own problems, don’t compound your problems by buying someone else’s.

That One Mac Guy

I bought my first Mac in 2004, a white plastic iBook G4. It was slow, the screen resolution was terrible, but wow did I love Mac OS X. After several years of loading every Linux and BSD variant I could find on the PC I bought in ‘99, I finally found a stable Unix-based operating system with a logical and beautiful user interface. The Mac was exactly what I wanted in a computer. I desperately wanted to use it at work, but working in a secure military environment, that wasn’t going to happen.

After I got out of the Navy in ‘06 I got my first civilian job on a six-month contract in Iowa. I was issued another PC, but after poking around a bit I found an old Mac that wasn’t being used, so I adopted it and made it work for me. One of the lead engineers saw it once and made the off-hand comment that I should “get that piece of crap off my desk”. I ignored him and carried on. My coworkers were having a LAN party one day after work, and invited me along to play some networked game. I brought my personal MacBook with me, and quickly realized that everyone else had custom built gaming PCs, and that my little laptop couldn’t keep up.

When I found stable employment in Des Moines, I was, again, issued a PC. A Dell laptop this time. Again I found an unused Mac in a closet somewhere, a PowerMac G4, booted it up and used it as my main workstation. After a few years, and knowing my boundaries, I found it possible to work under the radar and bring my personal Mac to work, by now a MacBook Pro, and typically just dropped the Dell in a drawer. From time to time there’d be something I’d need to do with the Dell, and it’d wind up back on my desk for a bit. I remember once a coworker, who would eventually be promoted to my manager, walking by my cubicle and mocking me loudly saying “typical Mac user, Mac in front of him, PC on the side to get real work done.” I didn’t like that guy.

Over the years Macs have become more mainstream and I’ve noticed that they’ve become more accepted at the different places I’ve worked. One thing never seems to change, though: whoever is in charge of taking care of employees’ computers always wants Windows PCs. I imagine because they are easier to manage en masse. Even at my latest company meeting, the team was discussing some feature rollout to the PCs, and it came up that I used a Mac1. I quipped that I was pretty sure that by now my using a Mac is a condition of my continued employment. (It’s not.) I further quipped that they could have my Mac… when they pried it from my cold, dead hand.

For my entire working life outside the military, I’ve been the outlier who uses a Mac. By now I’ve been using it exclusively for so many years that I’d be completely lost in Windows. The Mac has a carefully chosen set of tools that mold perfectly to how my mind works. Things are where I expect them to be, and they do what I expect them to do. As an information worker, I care deeply about the tools I use. I spend so much of my life using them that I want the experience to at least be somewhat enjoyable. I couldn’t imagine working anywhere that forced me to use a PC; if they did, I’d use it to start sending out my resume immediately.

  1. My whole team uses Macs, but my team is three people, so 🤷🏻‍♂️. 

Setting Up Webster's Dictionary

Via a post I saw today from Chris Bowler, via a newsletter by Sarah Bray, discussing an article written by James Somers, wherein he describes the writing process of John McPhee1, and how he uses a good dictionary to go from last draft to finished work. The emphasis here is on a good dictionary, namely the 1913 Webster’s Unabridged. I won’t attempt to describe how wonderful the dictionary is here; James did a fantastic job of that on his blog five years ago. I will, however, say that I think his installation instructions for getting the dictionary usable on your Mac are out of date. Here’s the easy way to do it.

First, download the compiled dictionary text. I downloaded it from a GitHub account, but who knows for how long that’ll be available, so I’m hosting the download here2. Webster.s.1913.dictionary.zip

Next, unzip the downloaded file and find the file named “Webster’s 1913.dictionary”. Click on the Finder’s “Go” menu and hold down the Option key to show the hidden “Library” folder. Click on Library, and find the “Dictionaries” folder. Open it, and drag and drop the new dictionary folder into it.
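If you’d rather skip the Finder, the same steps work from the Terminal. A sketch, assuming the zip landed in ~/Downloads (the exact name of the extracted bundle may differ slightly):

cd ~/Downloads
unzip Webster.s.1913.dictionary.zip
# Create the per-user Dictionaries folder if it doesn't already exist
mkdir -p ~/Library/Dictionaries
# Copy the dictionary bundle into place
cp -R "Webster's 1913.dictionary" ~/Library/Dictionaries/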

Now when you open the macOS Dictionary app, you can go into the settings (either by pressing ⌘, or by clicking on “Dictionary” then “Preferences…” in the menu bar), scroll down a bit till you find “Webster’s 1913”, click the check box next to it and drag it to the top of the list. Uncheck the “New Oxford American Dictionary”. Now when you click on a word in a good Mac app, then click just a tad bit harder3, you’ll get the definition from the new and improved Webster’s. It’ll also show up in Spotlight searches, and anywhere else the system-wide dictionary is used.

Now you have a far richer and more useful dictionary. It’s a great resource if you happen to be, or soon will be, a college student who needs to write often, and in volume.

  1. Good grief! 

  2. Which, ironically, is also hosted on GitHub. 

  3. If your Mac doesn’t have the force-press feature in the trackpad, you can hit ⌘⌃D while a word is highlighted to get the definition as well. 

A Pox on Anti-Vaxers

Brent Simmons linked to an informational site on the measles outbreak in Washington on his micro.blog, and mentioned how he nearly died from chickenpox when he was a kid. It reminded me of my chickenpox story.

I woke up on my 15th birthday, walked into the bathroom, looked in the mirror, and knew right away something was very wrong. I had red dots all over my body, including my face, which was already in poor condition thanks to the ravages of adolescence. I walked out and showed my mom and she immediately knew what had happened. She sentenced me straight to bed, and for the next week I was sick as a dog. I don’t actually remember a lot of it, but I remember the aftermath. The aftereffects of the chickenpox sores took weeks to fade away, which for my teenage self was worse than the week I spent in bed.

I was older than normal, for my time, to get the chickenpox. The symptoms of the disease tend to get worse with age. I would have given almost anything to have had a vaccine that would have kept me safe, and I’m glad that my kids will never have to experience this particular “rite of passage”.

Like the horrid chickenpox sores, anti-vaxers are a symptom of the strange madness that is sweeping the world right now. From moronic flat-earthers to holocaust deniers1, to climate change deniers, conspiracy theorists are having their day. Distrust of institutions, doubting science itself, and a revival of magical thinking are bringing us back to the dark ages of civilization, even as we experience scientific breakthroughs that create a better life for everyone.

My personal theory for all this is that it starts with religion2. Using the best of what we know of scientific methods, we can estimate that the Earth is billions of years old, but a literal interpretation of the Bible holds that all of creation is only a few thousand years old. Since both cannot be simultaneously correct, science becomes the enemy. Making science the enemy leads to a rejection of reason and rational thought itself, which opens a person to be more susceptible to wild conspiracy theories.

It’s wrong. There can only ever be one truth, and when what we see doesn’t line up with what God has told us it simply means we don’t understand one or the other well enough yet. Science isn’t a threat to God. God is not afraid of our questions. Science is the process by which we discover and attempt to understand the building blocks of God.

The science of vaccines is proven. If you care about your children, and the children of those around you, get your kids vaccinated.

  1. See also, scum of the Earth. 

  2. Politics plays a role too, but even then I think the core is religion. 

Inessential Thanks

I believe this will be the last time I muck about with the design of the site for the foreseeable future. After being disappointed by the available themes, and further disappointed by my own design ability, I went back to basics. And by basics I mean that I found a few sites that I like the look of and copied large chunks of HTML and CSS to build a custom Jekyll theme.

Readers of Brent Simmons’ Inessential site will probably recognize the fonts and general layout. I’ve added navigation at the top, minimized the layout to bare HTML5 tags, and set up some color here and there. I’ve also set up a bit of responsiveness for media, and syntax highlighting for code blocks. My hope is that this will be a good baseline for any future work I put into the site, but attribution must be made first.

I enjoy the simplicity of the design, and how clean and readable it is now. I especially like that there is no JavaScript in use. Nothing but pure HTML and CSS. No tracking, no stats, nothing dynamic or fancy. It’s just you, me, and the text.

P.S. Brent was kind enough to give his blessing to the new design, for which I’m greatly appreciative.

Example 50031 of Web Developers Overcomplicating Projects

I spent some time over the past couple nights adopting a new theme for the old digs here at jb. I found the beautiful Chalk theme by Nielsen Ramon and adapted my site to use it, including, finally, a working tags system. I’m quite happy with the tags, but I’m less happy with the bundled deployment system the theme shipped with.

The theme depended on NodeJS to build and deploy to GitHub, for reasons that I’m sure made complete sense to the developer but that I simply don’t care about. The documentation says to run npm run publish to build and push the site; doing so runs a script that does quite a bit of mucking about with the structure of the site.

# Checkout gh-pages branch.
if [ `git branch | grep gh-pages` ]
then
  git branch -D gh-pages
fi
git checkout -b gh-pages

First the script deletes any existing gh-pages branch, then creates a new one and checks it out. So far so good, I guess.

# Build site.
yarn install --modules-folder ./_assets/yarn
bundle exec jekyll build

I’m not familiar with yarn, but the site says that it provides “fast, reliable, and secure dependency management”. Ok, fair enough, but what dependencies could my little blog possibly have? Apparently, the package.json file lists what yarn is downloading:

  "dependencies": {
    "jquery": "^3.2.1",
    "npm": "^6.0.1",
    "retinajs": "^2.1.1",
    "svgxuse": "^1.2.4",
    "webfontloader": "^1.6.28",
    "zooming": "^2.0.0"
  }

Eh… ok. Why do I need NodeJS for this again? So, yarn installs a bunch of JavaScript and then Jekyll builds the site. Moving on…

# Delete and move files.
find . -maxdepth 1 ! -name '_site' ! -name '.git' ! -name '.gitignore' -exec rm -rf {} \;
mv _site/* .
rm -R _site/

Now things are getting interesting. This deletes everything except the git directory, the .gitignore file, and the site Jekyll just built. Then it moves everything out of the _site directory into the root and deletes that directory as well.

# Push to gh-pages.
git add -fA
git commit --allow-empty -m "$(git log -1 --pretty=%B) [ci skip]"
git push -f -q origin gh-pages

# Move back to previous branch.
git checkout -
yarn install --modules-folder ./_assets/yarn

Add, commit, and push the changes to GitHub under the gh-pages branch, then check out whatever you had previously and reinstall all the JavaScript. When I tried this my site went offline. I think this script might be out of date: GitHub requires sites that won’t build in Jekyll to be in the master branch, and if you want to use a custom domain name you have to add a CNAME file with the domain name you want to use.
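The custom domain part, at least, is a quick fix. A sketch, with example.com standing in for whatever domain you actually use:

# GitHub Pages looks for a CNAME file in the published branch
echo "example.com" > CNAME
git add CNAME
git commit -m "Add custom domain"
git push origin master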

To work around this I set up a separate repository just for the source of the site and moved the built site into the master branch of the main repository. But when I pulled everything down on my MacBook, the site wouldn’t compile, with Jekyll complaining about not being able to find jQuery. It was at this point I knew that I had gone down a terrible rabbit hole.

Luckily, I was able to get the site built once, so I had all the “compiled” code to work with. All I needed to do was use those files to build my own Jekyll theme with static assets and none of this JavaScript build nonsense. Apparently the original theme was trying to do something fancy with the assets by dynamically renaming them and adding assets selectively to the compiled site. I don’t care about any of that.

Jekyll uses the Liquid templating system, so it’s trivial to go through the site and add tags to pull in the content you need at build time. Using the theme as shipped required three different package managers (npm, yarn, and Bundler) to build a static site. That’s just not right. What’s so wrong with HTML, CSS, and just a little bit of JavaScript?
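For contrast, with the assets checked in as plain static files, my entire build and deploy now boils down to something like this (a sketch of my setup, assuming the generated site gets pushed to master as described above):

# One command builds the site, no node_modules in sight
bundle exec jekyll build
# Deployment is a plain add, commit, and push of the result
git add -A
git commit -m "Update site"
git push origin master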

I don’t know Nielsen, and I’m sure he had good reasons to build the theme like he did. I do think it’s beautiful and I’m thankful that he released it as open source so I could use it. For me though, I don’t need all those layers in my life. I just want an easy way to write and publish my site, and have it look and feel like something I care to have my name on.

It used to be you could learn how to build a web site by right-clicking and selecting “view source”. But now, everything easy is hard again.

Merging the Mac and the iPad

It seems undeniable that, given an infinite timescale, Apple will eventually simplify their two most popular systems into a single platform. Merging macOS and iOS would, theoretically anyway, provide users with the best of both worlds, and developers would finally have a single platform to target instead of two. This concept seems to run counter to what Apple executives have said in the past about the Mac, specifically that “the Mac keeps going forever”, but the interview that statement comes from is five years old now, which… in Silicon Valley terms, really is forever ago.

Since then we’ve seen some interesting ideas come to market, like Microsoft’s Surface Studio, a desktop computer with a 28” touchscreen and stylus, and an odd dial that can sit either on the desk or directly on the screen. Samsung is hooking their Galaxy phones up to a keyboard, monitor, and mouse to use the phone as a desktop computer through the DeX dock. Outside of Apple, touchscreens on mobile computers are nearly ubiquitous, from Windows 10 PCs to convertible Chromebooks. Inside of Apple though, we’ve seen almost no crossing of the streams… almost.

The iPad Pro is a computer unlike any other. Incredibly powerful, but hamstrung by limited software and user interaction capabilities. Geekbench scores are impressive, but you can’t hook up a thumb drive. It looks like a good device for free-form drawing and artistic work, but outside of basic static sites you can’t use it for web development. More than anything, iOS’s reimagining of an operating system for the modern, mobile world eschews decades of proven user experience work that’s gone into the macOS user interface. macOS is consistent, discoverable, and reliable. A good Mac app behaves similarly to the other apps that run on the Mac. Copy & paste, undo & redo, and standard keyboard shortcuts function the same across well-designed third-party apps and Apple’s own bundled apps. At least they did, until Mojave.

Mojave introduced four new bundled applications to the Mac. Home, News, Stocks, & Voice Memos were ported directly over from iOS using an as-yet-unnamed unified development framework popularly referred to as Marzipan. Apparently the framework is only half-baked, because the apps themselves do not at all act like they belong on the platform. Keyboard shortcuts are missing, UI elements are entirely out of place; it’s a mess. On a recent episode of The Talk Show, Jason Snell and John Gruber discuss the future of these apps, and Snell suggests that what makes a “Mac app” might be changing to meld around what these new iOS apps on the Mac become once the framework is more stable. Neither Jason nor John is a slouch when it comes to discussing the Mac, but in this particular case I think Jason is wrong.

What makes a good Mac app is not an indiscernible feel or look to the application. A good Mac app behaves the way the Mac has taught people to expect applications to behave since 1984. That’s how an application looks and feels like it belongs on the Mac: things are where they are expected to be, and the application responds as expected when the user interacts with it. If Apple wants to bring iOS apps to the Mac, I certainly hope they have more in store than this. These iOS apps are going to have to learn to behave how the users of the platform expect them to behave, not the other way around.

In many ways, I think Apple found themselves at this crossroads almost by accident. In fact, I think the “macification” of the iPad is to its detriment. iOS was never meant to be used the way the iPad Pro is advertised. Features like multitasking and windowing seem like they were wedged into the OS when Apple found themselves with a less popular platform than they’d hoped. Apple thought that the iPad was the future of computing… what if they’re wrong?

Apple stubbornly wants the iPad to be the future of computing, so they’ve been focusing on making it more capable for power users, adding more and more hardware power and confusing the pure simplicity of iOS with undiscoverable features and unfulfilled promises. What if, in the next couple years, Apple decides to right the ship and build a truly good MacBook/iOS hybrid?

What about an ARM Mac with a detachable touchscreen? Or one that folds over on itself? What if Apple learned all the best lessons from Microsoft’s experiments with their Surface lineup and did it right with the Mac? What about an iMac you can draw on? I’d love to be able to create my OmniGraffle drawings on my Mac with a huge canvas and an Apple Pencil. I’d love to be able to use touch on my Mac to interact with the UI when appropriate, and use the trackpad and keyboard when not. Let the Mac grow the way the users actually want it to grow and let the iPad go back to being just the best tablet on the market. Apple could simplify iOS again, and concentrate on making the Mac the best tool for getting things done.