The Mac vs PC dilemma for a programmer Part 1: The problem that is Windows

Having recently started my Computer Science and Electronics degree, there was one thing that shocked me most: the ubiquity of Macs. Having no coding experience or knowledge, I had assumed that Macs were not for developing software. It turns out I was wrong. In fact, most people with programming experience (at my university) own and use a Mac. That’s not to say owning a Mac makes you a better programmer, but it does make your life a lot easier.

There are two problems that lead programmers to use a Mac:

The first problem is Windows. More specifically, the proprietary standards and procedures that Microsoft enforce in Windows. Installing a C toolchain on Windows isn’t too difficult, but it requires more effort than on a Unix-based OS like macOS. Then you have to be careful about which versions of Windows your program is compatible with. Windows carries a lot of legacy support, which is nice for companies that want to run insecure, slow, 20-year-old software, but incredibly inconvenient for programmers, because you constantly have to be aware of the weird nuances that Windows has and other operating systems do not. Installing a library like SDL is far from the pain-free experience it is on Unix-like systems.
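To make the comparison concrete, here is roughly what the Unix side looks like: a minimal SDL2 “open a window” program in C. This is only a sketch, assuming SDL2 has already been installed from a package manager (Homebrew on macOS, apt on Ubuntu, and so on); the file name and build command are just illustrative. The same source builds unchanged on macOS and Linux with something like cc main.c $(sdl2-config --cflags --libs), whereas on Windows you typically have to fetch the headers and libraries yourself and wire them into your toolchain.

```c
/* Minimal SDL2 example (sketch): open a window, wait two seconds, quit.
   Assumes SDL2 is installed so that sdl2-config can supply the right
   include and linker flags. */
#include <SDL.h>

int main(int argc, char *argv[]) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        SDL_Log("SDL_Init failed: %s", SDL_GetError());
        return 1;
    }

    SDL_Window *window = SDL_CreateWindow("Hello",
                                          SDL_WINDOWPOS_CENTERED,
                                          SDL_WINDOWPOS_CENTERED,
                                          640, 480, 0);
    if (window == NULL) {
        SDL_Log("SDL_CreateWindow failed: %s", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    SDL_Delay(2000);             /* keep the window on screen briefly */
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```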

Of course, this depends on what languages you use. Java, for instance, is completely fine on Windows, comparable to Mac and Linux in every area.

This ultimately leads to a reliance on many workarounds. This is not good for a programmer who wants their program to compile and run on almost any machine.

It is now that we must bring up Linux. The saviour for Windows users who want to write programs! Well… not quite. Linux is far, far, far better than Windows for programming. Things work without workarounds, and installing compilers and software is never more inconvenient than copying and pasting a few commands into the terminal.

This leads to the second problem: Linux comes in many different flavours, all with their own set of problems. Ubuntu, for example, the most Mac/Windows-like distribution, is ugly (IMO) and doesn’t support apps like Microsoft Word (without hours of messing about). This makes Linux great for programming, but inconvenient for everything else. Linux distributions are not designed in the same way as macOS and Windows. macOS and Windows have to appeal to a large market, which is why they have to look fairly pretty, stay fairly consistent and provide adequate support. A Linux distribution does not have all of these goals. It may have some of them; Elementary OS, for example, focuses on aesthetics.

Ultimately, PC users usually have to stick with a dual boot. This is an inconvenience which can be detrimental to productivity. Having an OS that just works is something that I believe most programmers want, especially when working on mission critical projects.

Here we are, the solution: macOS. It has all of the conveniences of Windows: all the standard software you need, a pretty layout and a consistent design. It also has the conveniences of Linux (as it, too, is Unix-based): easy installation of compilers and tools, and no need for workarounds.

All seems well and good for Apple. However, in part 2 I will focus on hardware: the final crux of this problem.

Note: I was using a Windows PC until May of this year, when I made the switch to a 13″ MacBook Pro.

The Post-PC era has yet to arrive

In 2013, Tim Cook called this the post-PC era. But that hasn’t really turned out to be the case. The PC still stubbornly sticks around.

It certainly is true that the peak for the PC has passed. PC sales (laptops and desktops) were greatest in 2011, and they have only declined since. But tablet sales have also declined, with their peak coming only 2 years after that of the PC [statista]. The trend, therefore, is clearly not that people are replacing their desktops and laptops with tablets. Instead, people are buying fewer of these devices year on year.

In truth, the tablet was never going to replace the PC, despite what Apple (or its CEO) has tried to claim. (Microsoft claimed this before Apple, but it was clear that what Apple and Microsoft class as tablets are totally different.) And, I’ll be honest, I don’t understand why Tim Cook said that. He of all people must realise the unique opportunities the PC brings.

In a fairly recent statement, Cook said:

“The desktop is very strategic for us. It’s unique compared to the notebook because you can pack a lot more performance in a desktop — the largest screens, the most memory and storage, a greater variety of I/O, and fastest performance. So there are many different reasons why desktops are really important, and in some cases critical, to people.”

[TechCrunch]

It seems that Cook did deliver on that promise just yesterday at WWDC, with a refresh of pretty much all of the Mac line-up (sorry, Mac Mini), including the Mac Pro, which we can assume will be replaced by the iMac Pro. So it would appear that Apple’s post-PC mentality was a fad, a blip in history, soon to be forgotten. Just a reminder of that history:

Just before WWDC yesterday, the Mac Pro was over 3 years old (still technically is), the Mac Mini was over 2 years old (and terrible) (yeah, that’s still true), and the iMac hadn’t been updated since October 2015.

It’s always easy to take history out of context, so compare that to Apple’s previous record: the average refresh cycle for the iMac is 317 days, compared to the 460+ days since the last iMac before WWDC yesterday [MacRumors]. The Mac Pro also had a 3-year gap between the 2010 model and the 2013 trash can, but the 2010 model was future-proofable, making that gap less painful for many prosumers. So clearly, this was not normal Apple behaviour.

Apple also released a new iPad Pro and a slew of new features specifically for iOS on the iPad, bringing it more in line with the macOS desktop. It’s clear that Apple sees a future with limited room for the PC, but for now the PC is very much part of the future.

Goodbye Wunderlist

It always seems to happen: Microsoft buys a small company that builds a really cool productivity app… and kills it. Sunrise took the bullet; now it’s Wunderlist’s turn. Whilst Wunderlist may still be on the App Store, it’s just waiting to be pulled. Microsoft have officially killed off Wunderlist in favour of their new app, Microsoft To-Do.

To-Do is made by the same lot who made the Wunderlist we know and love, but there’s always something special about an app that isn’t in the grip of one of the big three. This isn’t just a case of supporting the underdog. There are benefits to using apps from the small guys:

  • they’re forced (with exceptions) to be multi-platform
  • they rely on their customers first (not shareholders)
  • they are (usually) easier to connect with

Thankfully, Microsoft are usually good at these three things, and Microsoft To-Do looks promising, with Microsoft committing to bring the best of Wunderlist into To-Do. Once that happens, Wunderlist will be (officially) dead.

Why Microsoft chose to reinvent the wheel they already had in their shed, I have no idea. It seems ridiculous, but if you’re interested in that, look here.

Wunderlist has been a go-to tool in my arsenal for over half a year, and a great one at that. I could continue using Wunderlist and eventually migrate to To-Do, but I have decided to use this opportunity to try something new for my to-do-listing needs. So now I’m using Todoist. I may give my full thoughts once I’ve become properly acquainted with it.

I’ll miss you Wunderlist and the view you gave me of Fernsehturm.

Hardware/Software

Almost all modern consumer electronics consist of two parts: hardware and software. We’ve always had hardware, but software has exploded only in the last few decades. And as software has exploded, it has taken over many of the roles that hardware used to handle.

Of course, hardware is needed for software: what good is software without any way of seeing it or, more importantly, writing it? But eventually, I see hardware becoming so subtle that it ceases to be important to the user. The software will be all we interact with, but even that will be ethereal. The benefit of this will be that we will no longer be using ‘devices’; ‘devices’ will just do things for us.

Currently, if we need to find something out, we go through this process:

  1. How will I find this? A phone call? Email? Message? Google search?
  2. Carry that action out and hopefully it works out
  3. If it doesn’t, rinse and repeat

In the future, this process could be as simple as asking a question. Instantaneously, the computer analyses your question and carries out these steps for you. For lack of a better word, it’s ‘frictionless’. This kind of technology may be a few decades in the pipeline, but in the near future many of the barriers we currently face will be removed through two simultaneous processes: the prevalence of software and the dissipation of hardware.

An example of this through the prevalence of software is Apple’s Handoff feature, which allows a user to switch from their Mac to their iPhone/iPad (or vice versa) without even thinking about it. This might not seem like an obvious example of the dissipation of hardware, but by allowing this coherent transfer between devices, the device itself becomes less important.

In terms of hardware, the almost complete disappearance of bezels on the Xiaomi Mi Mix removes much of the mental barrier when using a phone. It makes watching movies and playing games more immersive, but more importantly, you forget that you’re holding a phone.

Virtual and augmented reality will only accelerate this inevitable seamless future, in which technology does things for you, rather than you doing things with technology. On the other hand, it could be argued that it is us doing things for the technology. If you thought people being slaves to their smartphone screens was bad, imagine people with their VR headsets, not even seeing the real world they live in.

Ultimately, we will embrace the new technology, as is always the case. However, what will decide whether its impact is positive or not is how we choose to use it. It has the power to connect us with more people in a more profound way, as Facebook did when it first arrived. It’s all a matter of how people adopt and use the technology.

Why I quit Vipassana

Vipassana is a meditation technique; for more information, see this Wikipedia article.

After the brutal 10 days that started my quest into Vipassana, I felt happier, more energetic and, most importantly, at peace with myself and, as a result, with others. I was a strong advocate of the course, recommending it to friends and family. I would say the 10-day course worked. It was a success.

But one course, despite it totalling over 100 hours (4 days) of pure meditation, will not get you to the ultimate goal. Like anything, you must continue the practice in a sufficient way, not only to maintain its benefits but also to increase them.

S.N. Goenka recommends, at minimum, 2 hours every day: 1 hour in the morning and 1 hour at night. It sounds like a lot because it is a lot. 2 hours every day over 2 weeks totals more than a whole day (1 and 1/6 days). Over a year, that amounts to about a month. And if you practise Vipassana for 2 hours every day over 50 years, that adds up to more than 4 years of your life. It’s crazy.

And, yes, the benefits of Vipassana are definitely non-trivial, but 2 hours every day isn’t trivial either. Even if you can find the time, I find it incredibly difficult to stay motivated to keep practising (often, after 15 minutes my mind convinces me to stop wasting time and start doing something productive).

Continuity of practice is the key, says Mr Goenka himself. So practising occasionally may provide benefits only sporadically, just as exercising occasionally would. To feel the true benefits of Vipassana you need to commit a lot of time and effort. This isn’t one of those “in the 21st century people don’t have time to…” statements; it’s more that unless you feel the benefits immediately, 2 hours a day feels impossible.

Maybe I’m doing something wrong. Maybe I’m playing ‘the sensation game’ (Mr Goenka says that many people who keep coming to courses and still feel little benefit are playing a game with their sensations instead of just observing them). I honestly don’t know. However, this technique is meant to be universal, so there must be something I can do to keep it up.

Calculations
2 hours a day = 1 and 1/6 days every 2 weeks
= about 30 days (roughly a month) a year
= more than 4 years over 50 years
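If you want to sanity-check those figures, the arithmetic fits in a few lines of C (just a quick sketch; the constants simply restate the assumptions above: 2 hours a day, 24-hour days):

```c
/* Quick sanity check of the time arithmetic above. */
#include <stdio.h>

int main(void) {
    const double hours_per_day = 2.0;

    double per_two_weeks = hours_per_day * 14 / 24;      /* in days  */
    double per_year      = hours_per_day * 365 / 24;     /* in days  */
    double per_50_years  = per_year * 50 / 365;          /* in years */

    printf("Every 2 weeks: %.2f days\n", per_two_weeks); /* ~1.17 */
    printf("Every year:    %.1f days\n", per_year);      /* ~30.4 */
    printf("Over 50 years: %.1f years\n", per_50_years); /* ~4.2  */
    return 0;
}
```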

Apple’s naming convention still doesn’t make sense

At Apple’s annual developer conference, WWDC, Apple retired a name they’d been using for 15 years. Unlike OS 9’s funeral, OS X’s wasn’t so dramatic, and for good reason: it’s not a big change. OS X is now becoming macOS to fit more in line with iOS, watchOS and tvOS. The problem is, it still doesn’t make sense.

That looks wrong:

macOS

That’s better:

OS X

Ooh, so pretty and streamlined. Until you look closer.

  • iOS 10
  • macOS Sierra
  • watchOS 3
  • tvOS

That doesn’t look right either. It’s a confusing mess: iOS and watchOS get version numbers, whereas macOS gets a semi-random name. It’s really not a big deal, but I will say this: I can’t imagine Steve Jobs allowing it.

The Height of Computers

Artificial intelligence (AI) is the future, and we have to accept it. But it will also be the demise of mankind. Thankfully, most of us will be dead before we get to see the end of our species. But still, don’t be too sad.

I have been convinced by the likes of Stephen Hawking that AI is dangerous. It will evolve at a rate we cannot even imagine. Eventually, it will replace us, much as the Neanderthals were defeated by Homo sapiens: the better, smarter and more cunning species won. It doesn’t sound like a very positive view of the future, and I will agree: it’s upsetting to think that mankind’s greatest achievement will lead it to its ultimate fate. At the same time, AI is the next step in biological-digital evolution. It sounds unprecedented, but it is not too unlike the story of how the Neanderthals were entirely replaced.

From all this, I have come to the conclusion that we just have to accept it. I even aim to work in some part of the AI industry that will lead to our downfall. I do not subscribe to the belief in fate or in nature being controlled by some sort of external being, but I do believe that nature has a way of self-improvement. It will let the better side win, and in this case that will be AI. How it will win is anyone’s guess. I’ve even got a few guesses of my own:

  • AI outright captures us and destroys us, similar to what the Cylons attempt in Battlestar Galactica
  • A combination of climate change and AI: we cannot live in the extreme conditions on Earth, but AI can

That second possibility also acts as another reason why we need to accept that AI will replace us. If we do not develop AI, climate change or some unforeseen event will destroy us. We should at least go out in style, not in pity and desperation. AI could be our redemption for treating our planet so poorly.

You may now feel a bit heroic. Let’s make a plan. We can produce AI, redeem ourselves, but still live on, right? We could cheat. If we all moved to space before the AI were smart enough to realise, perhaps we could escape and live our lives completely separately from AI. It would be like time travel, going back to a snapshot in time before AI.

The problem with this plan is that you could then not answer the question: ‘Why did you make AI?’ For this plan to succeed, we would need to have it all sorted out before the AI were even useful tools, which would mean that we created the AI for no reason. We created them, then left because we knew they would kill us, and they never even helped us. It would be a complete waste of time. So, in reality, what would happen is that AI would start doing dodgy things, like in the movie I, Robot, by which time it would be too late to jump off into space. By the time we were ready to do so, guess what: they’d be smart enough to figure it out and put a stop to it.

To leave you on a slightly more uplifting but still bleak view of the future: perhaps now (the next few decades) will be the height of computers, computers before they do everything for us and turn us into trivial beings. Cherish this moment in time, because it may be our best.

HAL 9000
“I’m sorry, Dave. I’m afraid I can’t do that.” (2001: A Space Odyssey) The ‘flawless’ HAL 9000 betrays its masters.