The value of experience

August 25th, 2014
Filed under: Internet, Politics, Society | Huibert @ 9:13 am

In an interview with the Information Security Media Group publication, White House cybersecurity coordinator Michael Daniel admits to having no practical experience with the subject matter. Daniel argues that “being too down in the weeds at the technical level could actually be a little bit of a distraction” from his job of advising the president about ongoing and emerging information security issues.

The White House appointed Daniel to the position in May 2012; he had previously served as the intelligence branch chief in the White House Office of Management and Budget. He believes that his lack of practical experience in the field is offset by his master’s degrees in national resource planning and public policy. He also credits his previous government experience, augmented by his martial arts training, for his success in the position.

As the Electronista article states, Daniel isn’t responsible for the technical details of a fix or solution to a country-wide issue. Rather, his job is to assess the situation, report to the president, and bring other agencies into the fold and “on the same page” about an issue. Jim Lewis, senior fellow at the think tank Center for Strategic and International Studies, argues that the lack of experience doesn’t hinder Daniel in the position: “Computer scientists were in charge and they did a terrible job, being lost in the weeds and largely clueless about policy. You need someone with a strategic point of view and policy skill to make progress.”

Every time I read something like this, I get extremely upset. This theory assumes that there are only two types of people in any organization: the leaders, who can handle any kind of situation, and the specialists, whose only responsibility is to execute the master plan. This is great for many executives because, unless they are found to be personally responsible for a major screwup, it shields them from any accountability. If something goes wrong, it is never because the plan was flawed in the first place; it is due to poor execution, which can almost always be blamed on managers lower on the organizational chart. The problem, of course, is that this is simply not true. Most issues in a company, especially in high tech, can be traced directly to the lack of a clear vision that can be communicated to employees for proper execution. Executives who do not understand their product or market in detail are unable to produce a winning growth strategy; it is that simple.

In this context, former GE CEO Jack Welch is often mentioned as an example of a leader who didn’t need to be an expert in washing machines to turn around a very complex, diversified company. However, there are few Jack Welches in the world, and it is easy to find many examples of successful leaders who were experts in their markets, especially if we only consider fast-growing markets like cybersecurity. My personal opinion is that Jack Welch, who undeniably achieved great success at GE, is now used as an example by mediocre executives to justify why knowing nothing about their respective fields is not a problem, and this is simply wrong.

MBA programs at prestigious universities are in large part to blame for propagating the idea that lack of experience is not a problem. Business professors usually tell their students from the beginning that they are destined for greatness and that they will learn how to make decisions by studying the experience of great company leaders. However, contrary to what many execs seem to believe, leadership is not about making decisions by choosing one of the options presented to you by your team. It is about setting a direction and executing on a plan that you have designed. That requires both experience and guts. I don’t know about Daniel’s guts, but he clearly lacks experience in cybersecurity, expertise that is extremely hard to acquire, and that will significantly hinder any attempt he makes to define a “strategic point of view”. Therefore, from my point of view, he is clearly a poor choice for the job. That doesn’t mean that his government experience is not important, it clearly is, but he should at the very least have recognized this shortcoming and explained how he planned to address it, instead of simply dismissing his critics. President Obama is accountable for having chosen Mr. Daniel for this position, but Daniel himself shares part of the responsibility. Leaders, to be successful, need to have zero tolerance for mediocrity, and that includes their own. Those who accept a leadership position need to be convinced that they are a good fit for the job and that they will be able to deliver results. Integrity begins with an honest introspection exercise to find out whether you are the right choice for the position.

Has Apple really changed?

June 7th, 2014
Filed under: Apple, General, iPad, iPhone, Mac OS X, Macintosh | Huibert @ 2:37 pm


This has been quite an exciting week. Apple has introduced over 4,000 new APIs for both OS X and iOS, and most developers have been raving about how the “new” Apple led by Tim Cook has changed. They claim that the company now listens more to its customers, citing custom keyboard support, third-party use of the fingerprint reader and new inter-application communication mechanisms as proof that things have changed.

Frankly, I am not convinced that much has fundamentally changed. When Apple launched their Rip. Mix. Burn. campaign in 2001, they were clearly listening to what customers wanted at the time. Delivering on that vision took planning: it included buying the application (SoundJam MP) that would eventually become iTunes and adding the capability to easily burn CDs. It took some time, but they eventually released the product they wanted.

What happened this week was similar. Apple may have wanted to offer the possibility of installing alternative keyboards for a while, but it took some time to deliver the capability in a secure form. What is so dangerous about alternative keyboards? Well, imagine that the keyboard logs all your keystrokes and sends them to some server in Ukraine. All your passwords and credit card numbers would be gone. So, what is needed to make sure that supporting alternative keyboards is safe? One way is to avoid using the keyboard to enter credentials or credit card numbers in the first place. That is something Apple has addressed by releasing iCloud Keychain in iOS 7 and by opening up the fingerprint reader in iOS 8. The other thing to do is to deny the keyboard app internet access unless the user explicitly allows it. That was also announced by Apple as part of iOS 8. It is likely that by the time Apple releases iOS 8, all new iOS devices will include a fingerprint reader and, as a result, should be well protected against malicious keyboard apps. As you can see, opening iOS to alternative keyboards is not something totally new that came out of nowhere; it is the result of careful planning and of making sure that everything is in place before launching a new feature.
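To sketch how this consent model surfaces to developers: based on what Apple described at WWDC, a custom keyboard ships as an app extension that must declare in its Info.plist whether it wants “open” (network-capable) access; by default it gets none. The fragment below assumes the extension keys Apple documented for iOS 8, so treat it as illustrative rather than definitive.

```xml
<!-- Info.plist fragment for a hypothetical iOS 8 custom keyboard extension.
     RequestsOpenAccess = false (the default) keeps the keyboard fully
     sandboxed, with no network access at all. -->
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.keyboard-service</string>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>RequestsOpenAccess</key>
        <false/>
    </dict>
</dict>
```

If the keyboard instead sets RequestsOpenAccess to true, iOS prompts the user to grant “Allow Full Access”, so network access remains the user’s choice, exactly the safeguard described above.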

Swift, the new programming language launched by Apple at WWDC, is another interesting example. This new language has been in the works for about four years now. It is a modern language with a lot of cool new features, but I would hardly call it revolutionary. What is interesting about Swift is that, as far as I know, it is the first language designed from the ground up to make the use of an existing library much easier. Normally, a language is designed to solve a particular problem that existing languages cannot handle well (multi-tasking, security, etc.). Swift, however, seems designed solely to give the Cocoa framework a new lease on life. By basing variable types on Cocoa objects (for example, strings bridge to NSStrings) and hiding the complexity of handling structs, Swift makes it much easier to write code for Apple platforms without jeopardizing the huge investment made by Apple and NeXT in Cocoa over the last 25 years (NeXTSTEP was launched in 1989). This makes a lot of sense, because it preserves Apple’s biggest asset while giving us developers what we want. In that sense, Swift is evolutionary, not revolutionary. It is the result of a plan launched years ago with the adoption of the LLVM compiler and the launch of Objective-C 2.0, and if Apple really is planning to eventually move the Mac away from Intel, applications written in Swift will make the transition transparent for application developers.
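To see what that bridging means in practice, here is a minimal sketch (in current Swift syntax; the string is just an example): a native Swift String can be handed to NSString APIs unchanged, so decades of Cocoa code keeps working.

```swift
import Foundation

// A native Swift string...
let greeting = "Hello, Cocoa"

// ...can be wrapped as (or bridged to) Cocoa's NSString, so all of
// Foundation's string APIs remain available to Swift code.
let bridged = NSString(string: greeting)
print(bridged.length)      // 12: NSString counts UTF-16 code units
print(bridged.uppercased)  // "HELLO, COCOA", via NSString's uppercase API
```

Note that when this post was written Swift was at version 1.0 and some of these names differed (e.g. uppercaseString); the idea, basing the language’s types on Cocoa’s, is the same.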

WWDC 2014 was a great event because it saw many initiatives started by Apple years ago come to fruition: not because Tim Cook suddenly started listening to customers and developers, but because Apple seems to be accelerating the delivery of features that result from a carefully crafted plan. Apple’s success depends on maintaining a clear medium- and long-term plan to deliver its vision, as it has done so far, not on delivering a long list of short-sighted features.

WWDC 2014

May 31st, 2014
Filed under: Apple, General, iPad, iPhone, iPod, Mac OS X, Macintosh | Huibert @ 3:47 pm


Once again, I won’t be able to attend WWDC. I am very excited though that on Monday we will be able to see what Apple has in store for us for the next few years, because I believe that this event will be more about announcing the foundation of things to come than actual products we will be able to buy in June. 

From a developer perspective, Xcode 6, iOS 8 and OS X 10.10 should include enough new functionality to keep us busy for the next few months. Support for larger iPhones will probably translate into a lot of work preparing old apps for the official launch of the new devices. Something similar is to be expected for Mac developers, who will have to deal with a flatter overall design, including updated controls. I certainly hope the changes are more than skin deep, because while appearance is important, and a uniform look and feel across Apple devices can make the user’s life much easier, when I use my Mac it is all about what I can do with it; great looks come second.

What I would like to see announced at WWDC are improvements around iCloud, namely lower pricing and APIs for Windows, Linux and Android. Writing a cross-platform app that syncs data among devices is not very difficult; there are many scalable document-based data stores that can handle this task (Cloudant comes to mind). The problem is persuading customers to pay for the service. Apple, on the other hand, can do that much more effectively because it already has a large customer base that uses the free service or pays for iCloud once a year and gets a lot of value by using the service with not one but multiple apps. The value proposition is much better. Sure, there are competing services, like Dropbox, but I like the Apple option better because I can safely assume that all Apple customers have an account.

On the hardware front, I do not have many expectations. Apple has been unable to keep hardware leaks from happening in China in the past, and right now we haven’t seen enough credible information to believe a product launch is imminent. If there are any announcements, they will be like last year’s Mac Pro: a simple preview with a launch date, to generate pent-up demand.

I have no doubt that WWDC 2014 will be all about announcing the infrastructure for things to come, namely new services available only to customers with modern hardware (the fingerprint reader and the M7 chip, as well as future devices), which will create a need to upgrade old devices and leave the competition in the dust for a while. Apple has had several years to build the infrastructure and plan for this moment. On Monday we will finally understand what Apple has been working on. We may not understand the full reach of these announcements until Apple launches its new devices in the fall, but it will be an exciting event. I will be spending a lot of time on the treadmill next week, watching the WWDC session videos on my Apple TV.

Website redesign

February 24th, 2014
Filed under: General | Huibert @ 2:01 pm


Yesterday, after working on the project for over a month, I finally published my redesigned home page. It was long overdue: the previous design looked dated and did not support mobile devices. The new site sports a modern responsive design, and I really hope you enjoy the enhanced navigation experience.

Although I have some HTML, CSS, PHP and JavaScript experience, I certainly do not enjoy web site design. I am a lousy designer, and handling the subtleties of rendering the same design in different browsers on multiple platforms is not something I enjoy. That said, like most techies, I am way too proud to even consider outsourcing a personal web site design to a professional designer.

So, I started by looking for a tool that would allow me to keep design activities to a minimum and focus on content. After evaluating several options, I finally settled on RapidWeaver, a Mac application developed by Realmac Software. This may come as a surprise to many, as RapidWeaver by itself is a quite unremarkable piece of software. It would be quite easy to argue that at US$79.99, it is way overpriced for what it does.

For all of its limitations, though, RapidWeaver provides one great feature: extensibility through plugins. One of these plugins, Stacks 2, achieves an incredible feat: it turns a mediocre product into an unmatched web design tool that is powerful, flexible and easy to use. Sure, you will probably have to invest another US$100 in plug-ins and stacks (take a look at the stacks developed by Joe Workman) to achieve whatever you want to do, but at that point you will be able to create almost any kind of complex web site with almost no effort.

I really can’t emphasize enough how pleased I am with this RapidWeaver-based solution, and I recommend it to anyone who wants to develop a professional-looking site without going through the hassle and cost of hiring a pro.


December 20th, 2013
Filed under: Politics, Society | Huibert @ 12:39 pm

You cannot imagine how glad I was to see the controversy that has followed the indefinite suspension of Phil Robertson from his show, Duck Dynasty. For those who are not aware of what happened: A&E, the network that produces the show, decided to remove Phil after learning that he had given an interview to GQ magazine in which he made a number of statements against gay people and also said a couple of things that can be perceived as racist. This is standard procedure, and we have seen it happen on multiple occasions. This time, however, something different happened. People started to rally behind Phil, criticizing the decision and defending his right to free speech. Let me be clear: I do not share any of his beliefs, but I am sick of living in a world where those who mean well try to stifle the freedom of speech of those who think differently.

Back when I was in high school, we had debates on a variety of topics. We discussed the role of prisons, the death penalty, abortion and many other controversial topics. I remember a girl, raised by a well-intentioned hippie family, who had a big heart and was wicked smart. We had nothing in common and clashed many times, but we had very interesting discussions, and I was really interested in trying to understand her point of view and where she was coming from. It wasn’t just about arguing and winning a stupid debate; it was about trying to understand each other and address the issues. The debate worked because everyone was free to speak their mind, without consequences.

In our modern world, freedom of speech is recognized in most countries, but there can be serious consequences if you say something that is not politically correct. Careers can be destroyed, companies shut down. It is the equivalent of a crime that cannot be tried in criminal court but can be punished in civil court. It doesn’t make any sense. Freedom of speech should mean exactly that: freedom to express what you believe, with no restrictions or consequences. Look, I am not naive; I understand why we try to muffle some opinions: we want to avoid going from hate speech to violence, and I understand this is a risk. That said, I do not believe that problems are solved by avoiding speaking about them. We need to hear what people have to say, listen actively and act decisively to solve the problems. Yes, some of those problems are very complex, but avoiding an issue will not make it go away; the pressure will keep mounting and eventually we will face an explosion.

Right now, pressure groups are effective because there are few targets they need to monitor: basically TV and radio networks, large advertisers, etc. I had hoped that the Internet would change that by increasing the number of content providers and making control much harder. However, I may have been too optimistic. Now that we know that the NSA is monitoring all that is said on the Internet, we are all compelled to share politically correct views if we want to avoid trouble.

Maybe it is just the troll in me speaking, but I really would like to live in a world where there is true freedom of speech.

Is XML dead?

November 18th, 2013
Filed under: Enterprise Architecture, IBM, IT Insight | Huibert @ 11:38 am

Over the last few years there has been a relentless attack on XML. First it was in the browser, where it was argued that, due to limited processing power and the need for responsiveness, a data format like JSON that could be easily and quickly parsed by JavaScript was far better suited for the job. Ever since that assertion was made, JSON’s ascent has been unstoppable. Very few HTML clients still rely on XML to exchange data with the back-end application, and their numbers are dwindling.
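To make the size-and-parsing argument concrete, here is a small illustration (the record is made up, and I am using Foundation from Swift rather than JavaScript, but the point carries over): the same data expressed both ways, with the JSON form parsed straight into native collections.

```swift
import Foundation

// The same record in both formats; the XML framing is noticeably heavier.
let json = "{\"name\":\"Ada\",\"age\":36}"
let xml  = "<person><name>Ada</name><age>36</age></person>"
print(json.utf8.count, "vs", xml.utf8.count)  // JSON is the smaller payload here

// JSON deserializes directly into native dictionaries and arrays; XML
// needs a delegate-based parser and hand-written state handling.
if let data = json.data(using: .utf8),
   let person = try? JSONSerialization.jsonObject(with: data) as? [String: Any] {
    print(person["name"] as? String ?? "")
}
```

The asymmetry is the whole argument: one format maps onto the language’s own data structures, the other requires a parsing layer you must write and maintain.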

On the desktop application front, XML seemed to be the king of the hill, with all major office suites quickly adopting it, led by MS Office. Interoperability and readability were thought to easily trump other considerations like size and performance, given that desktop PCs lacked neither storage space nor processing power. That started to change with iWork ’13, the newest release of Apple’s productivity suite. The reason Apple adopted a new proprietary binary format for its flagship product was that it was better suited to mobile devices such as the iPad and the iPhone, where space and processing power are indeed constraints. Since portability among different devices was a key requirement, XML was sacrificed on the altar of mobile adequacy. It is hard to blame Apple for the choice when you look at the numbers they presented: files now open much faster on the iPad, and that is a key factor for customer satisfaction. Microsoft may not move away from XML to a binary format immediately, because many customer-built applications rely on it, but they must have taken notice.

That leaves XML a small space in which to grow, mainly the integration space. It is hard to envision an Enterprise Service Bus (ESB) that is not based on processing and dispatching XML documents. In this context transaction management is still important, which makes SOAP-based services much better suited for the job than alternatives such as REST services. That said, there is pressure in this space too to avoid XML. That means that XML’s place in IT will be reduced to the large enterprise sector, the place where it was born. While XML may have seemed ubiquitous in the beginning, its disadvantages ultimately condemned it over the long run.

I must say that I am quite disappointed to see XML fail in so many spaces and applications. I still believe in its many virtues, and I feel that in most cases its disadvantages are overblown when the architecture of the application is properly designed and NFRs such as performance and security are planned for from the beginning. That said, trends and public perception are hard to fight, especially when some of the concerns are absolutely based on facts. It is clear that the use of XML in any application is no longer a given. There will be discussions about which alternative is better for a particular use case. Ultimately, Enterprise Architects will have to make a decision, and that is good, because XML has become what it should always have been: just another tool in their belt.


November 4th, 2013
Filed under: General | Huibert @ 11:12 pm

I started exercising about 18 months ago. At the time, I could barely walk for an hour at 5.6 km/h. Today, I routinely walk 10 km in under 1h20m, and I often walk 12, 14 or even 16 km per day. While I do not expect to win any medals at the next Olympics, what really strikes me is that back when I was sixteen, it took me about 40 minutes to run 5 km (yes, I know, I have never been an athlete). Now, thirty years later, I can walk faster than that, for at least twice the distance. Sure, I am probably a bit taller now, and also slimmer, but still, it is quite astounding what you can achieve by exercising regularly, no matter your age.

What I have learned is that while exercising is obviously key to improving your fitness, collecting data about your progress is equally important. That is where Withings comes into the picture. I bought their Wi-Fi scale last year and was so pleased with the results that I later bought their new Pulse activity tracker. I now have access to all my data on my iPhone or on their web site. It is easy to see what works and what doesn’t. For example, I used to believe that playing soccer with my colleagues was equivalent to playing Paddle. It turns out that playing Paddle is much more demanding. I also never imagined how little exercise I got on a normal work day. I now try never to go to bed without having walked at least 10,000 steps. It used to be “No pain, no gain”; today it is more like “No information, no gain”.

Right now I feel better than ever, and I do not believe I could have achieved this without proper monitoring. I am sure that competing products from Nike or Fitbit also work very well, but what I like about Withings is that they also offer a scale and even a blood pressure monitor that complete the solution. I do not endorse products very often, but I am so pleased with these that I really felt I had to.

My predictions for tomorrow

October 21st, 2013
Filed under: Apple, General, iPad, Mac OS X, Macintosh | Huibert @ 8:36 pm

Earlier today I laid out my expectations for tomorrow’s Apple event. That said, the prediction game is extremely entertaining, and I do not want to miss a great opportunity to play it. So here we go; these are my predictions:

  • iPhone: As part of the introduction, Apple CEO Tim Cook will try to make clear that the iPhone 5S/5C launch has been a success and will provide some data to assure everyone that the iPhone 5C is not a failure. There will also be some talk about the quick adoption of iOS 7 and how this compares to the Android world. Apple will announce availability dates for the new iPhones in additional countries.
  • iPad: For the first time Apple will try to clearly differentiate the iPad mini from the iPad. The iPad will be promoted as a tool for content creators, and that claim will be backed by a powerful processor, the A7, and a smart cover that will include a keyboard. In addition, the iPad will include the same fingerprint reader built into the iPhone 5S for additional security (no multiple accounts for now). That is also why the new iLife and iWork applications will be shown off on the large-screen iPad. The iPad mini will be pitched as a tool for content consumption, with a Retina display for enjoying movies, books, magazines and games. Apple will showcase the mini with third-party game controllers made for iOS and games updated for iOS 7 that support the new APIs. The new iPads will not be available in multiple colors.
  • iPods: This is no longer a strategic product category for Apple, and as a result I do not expect major changes to their 2013 lineup. That said, I believe that the iPod touch will be updated with an M7 chip, to prepare for the iWatch launch in the first half of 2014 (for more on this read my previous blog entry on this subject).
  • Mac Pro: This is the ultimate machine for video professionals. As a result, Apple will use it to demonstrate new versions of their Pro software tools in addition to OS X Mavericks (which will launch over the next two weeks). The demo will run on multiple screens, which means updated Thunderbolt Displays with two Thunderbolt 2 ports. There is a good chance those displays will support 4K (although there may be two versions, one with regular resolution, the other with 4K). The Mac Pro will be a very expensive computer, with an entry point just below US$3,000 but customization options that will easily bring the price to around or above US$10,000. In order to sweeten the deal for their target audience (and to fight against Adobe and Avid), Apple will offer bundles that include preinstalled single-user licenses of their Pro software. An upgraded Mac mini will be quickly mentioned, with 802.11ac, an updated Haswell processor and maybe (this is a long shot) Thunderbolt 2.
  • MacBooks: Apple will release updated MacBooks (including the Pro models) with 802.11ac and Haswell processors. The main selling point will be significantly longer battery life. Retina screens may come to all models (this is also a long shot).
  • Apple TV/iWatch: There will be no new product announcements this year, but we may see a price drop on the current Apple TV in order to better compete during the holiday shopping season. We will have to wait for another three to six months.

And that is it. There will not be a new product category this year, but Apple will still have a solid product lineup for Christmas that should allow them to have a successful quarter.

What do you think?

Update: I was definitely wrong about iPad differentiation. Choosing an iPad is harder than ever. Apple covered all the price points but did not explain why people should choose one model over another depending on their needs. I think this is a mistake; most people need to be guided, and confusion can be a sales inhibitor. The iPod touch wasn’t updated either.

Apple’s October 22nd event

October 21st, 2013
Filed under: Apple, General, iPhone, iPod, Mac OS X | Huibert @ 9:52 am

Next Tuesday Apple will unveil a lot of goodies. It is widely expected that the Cupertino-based company will unveil new iPads (probably with some new cover), updated MacBook Pros and a totally overhauled Mac Pro, as well as many software updates to both its consumer and professional offerings. That should be more than enough to justify a lot of excitement among the Apple faithful.

That said, there are still many unanswered questions. Take me, for example: I am looking to replace my first-generation unibody iMac and would also like to buy a second monitor that I could use both as a second screen for the new computer and as the main screen for my MacBook Pro when I need to do some work at home. The issue I have right now is that the 27” iMac does not align with the 27” Apple Thunderbolt Display. This is a big mistake that is preventing many iMac owners from buying a second monitor. Apple has to know about this issue, since there are complaints on its forums, and will most probably fix it at some point, hopefully on Tuesday.

Right now there is little chatter about an update to the Apple Thunderbolt Display, but there are reasons to hope for an updated model. The first is that with the release of a new Mac Pro with Thunderbolt 2, Apple needs either to update its displays to support it or to release some kind of Thunderbolt 2 dock. My money is on the first option. The second is that the current design of the Thunderbolt Display is reminiscent of the previous-generation iMacs, and a slimmer design is long overdue. If Apple updates its monitors and they align nicely with the new iMacs, my problem is solved.

On the other hand, if Apple upgrades the Thunderbolt Display but does not fix the alignment issue, the solution is to buy two monitors and a Mac mini (or a Mac Pro, but that will likely fall outside my budget). That is slightly more expensive and I will get less bang for my buck, unless the Mac mini is also updated on Tuesday (hopefully with Thunderbolt 2, a new Haswell processor and 802.11ac wireless networking, which would be nice). This is a possibility, but we don’t know for sure whether it will happen, since no rumors point in that direction (although it would be logical to expect an update to the mini at this point).

Of course, there are other possibilities. Apple could choose to release new (expensive) 4K monitors for the Mac Pro and not update their current products. That would be great for pros, but would leave me wondering if I should invest my money in products that haven’t been updated in quite a while.

The fact is that even though we already know many details of what will be announced on Tuesday, for Mac users there are still many unanswered questions that will keep us excited. We may well be riding a truck in a car age, but what can I say, I still love my truck.

Update: It seems that I was overly optimistic. No new Mac mini and no new monitors. Who knows, there may be new monitors in store when Apple releases the Mac Pro in December. I will wait patiently…

We have the facts. Are we reaching the logical conclusions?

September 12th, 2013
Filed under: General | Huibert @ 1:18 am

When I attended WWDC in 2006, Apple introduced Core Animation. It was a great technology for OS X that allowed developers to easily create dynamic interfaces. Everyone was wowed by the demos, but nobody was really sure how they could use the technology in their applications. Why? Because the technology had not been developed with the Mac in mind. That became painfully clear in January, when Steve Jobs introduced the iPhone at Macworld. To this day I blame myself for not understanding what was happening. We had all the facts: we knew that Apple was working on a phone, we knew they were interested in touch technologies, and we had seen Core Animation. Yet nobody, including myself, made the connection.

When Apple introduced the iPhone 5S and its new M7 chip, I remembered what happened in 2006. Something did not sound right. Is Apple really adding a new chip to its flagship phone just to allow Nike to develop a new app? Granted, this chip could be used by other companies to develop innovative apps, but I think there is more to it.

Samsung just released a smart watch that is plagued by problems. The most obvious issue is the very limited battery life of the device. That is mainly due to the fact that the watch is a (slow) computer in its own right and that it includes a camera and sensors. The other problem is price, US$300 is quite expensive for a phone accessory. How can a company produce a cheaper, more powerful watch with better battery life? Simply by offloading most of its work to the phone. Could the M7 chip be the sign that Apple is moving closer to releasing a watch based on this design principle?

Most analysts seem to believe that Apple will release a smart watch in the first half of 2014. Nobody expects Apple to release a new iPhone until the second half of the year. That means that if Apple’s rumored smart watch relies on some kind of dedicated hardware it has to be included in the current generation of iPhones. I think that the M7 chip is that dedicated hardware.

I understand that my reasoning could be wishful thinking. I may still be obsessed with my failure to understand why Apple had developed Core Animation. Yes, maybe. On the other hand, the more I think about this, the more sense it makes to me. What do you think?