03 February 2010

Why bother with strategy?

The following is a response I posted to a blog article titled "Four Reasons Why Productive People Hate Strategic Planning." The article riled me a bit and I couldn't help but post.

-----------------------------------------------------------------

Without a strategy, that is, a goal, how do you know you're not just wasting your time with make-work? The purpose of a strategy is to define the end goals and even some of the intermediate goals. It is not intended to lay out the minute detail of how to get there. Without strategy, you're just blundering about in the dark.

With a strategy, you should be able to answer questions like: "How do I know if I'm on track?", and "How will I know when I get there?" Like reading a street directory, you should be able to say things like "If I come across this intersection, I know I've gone too far but I can fix it by making the next two left turns."

Secondly, a strategy meeting must make decisions. That's the implication of the use of the word "plan." I'm betting that you've suffered too many strategic plans that make no decisions or make tactical, implementation decisions, or those decisions just haven't been visible to you.

Let's take a military example. A general defines a strategic battle plan, outlining broad but specific objectives. This gets broken down and, along the line, a regimental commander is instructed to advance on a specific port and hold it to facilitate further landings. This gets further broken up and passed down the line, ending up with platoon commanders who meet the enemy and need to decide how to attack them. Ultimately, a section commander (a corporal) is tasked with sneaking around onto an enemy's flank and attacking individual soldiers. To the corporals, the strategy may be known, but their concern is the soldiers immediately to their front. When you're pounding the ground, it's hard to see the far objective. I guess that's why generals sit up high in their ivory towers - so they can see the objective. The platoon leaders and section commanders have different objectives - take this enemy position and don't get hurt trying.

That's what strategy is about. By the way, I'm a pleb. Without strategy to guide me (or at least it guides my boss, who guides me), I'm just wasting my time and that stinks.

23 January 2010

Photography - my new obsession

I got to the end of 2009 and realised my life revolved around computers - work, study, (ahem) free time.

So I needed a new hobby.

  • Cycling - fun but not always portable.
  • Keep fish - they were bound to die early, I don't think my landlord was really going to go for it, I'd forget to feed them or clean the tank enough, and I move home a bit too regularly.
  • Get a pet - I rent OK...
  • Photography - portable, can take ages or hardly any time at all, I drowned two Olympus SLRs 10 years ago but I'm smarter now, right?

So, one tax return later, new camera gear. And you know what, it doesn't cost as much to make bad photographs anymore. Digital is really great. So I thought I'd give a few impressions of my gear and my re-introduction to photography.

I bought a Canon 450D and Canon 17 - 85mm kit lens just after Christmas from Ted's Camera.

I checked out about four shops in Melbourne and thought about buying online too. When I spoke to sales reps in the shops in the city, I was pushed towards Nikon gear - D5000 or D90. Having a look at the shop windows, I could have sworn there was a Nikon promotion on.

Second Hand Lenses

Second hand lenses looked pretty good and the market in them looked strong (using eBay as the guide). One strength was that Nikon has kept the same mount since the 1970s, so in theory any lens would be compatible. Then I found an article about Nikon lens compatibility: on the D5000, only fairly recent lenses were going to be compatible. Canon, on the other hand, had switched mounts when they introduced the EOS range in the mid-1980s, and compatibility goes all the way back to then. So, re-evaluating eBay, Canon came out on top. Probably not a major issue, but I didn't want to cut myself out of part of the market - having a large range of second hand lenses to choose from enhances my ability to play around with gear (and spend less money doing so).

Design and Ergonomics

Secondly, when I held them, the 450D seemed to fit the hand better. The size of the grip and the positioning of the LCD screen off to the left helped with the fit. For someone with big hands, neither the D90 nor the D5000 seemed to be as comfortable. One nice feature of the D90 was the second dial for making adjustments to aperture, shutter speed, etc. Not that it's bad on the 450D - actually it's quite nice - but maybe there are times when it's easier to reach a second dial.

I don't have to dig through menus very often to change settings. Even when I do need to access the menus, I usually only dig through one level (at most two) to get to what I want. Most of the settings I use regularly, I can add to a custom menu, which sadly is a bit too short. :(  (Am I being picky?)

Kit Lens

Yes, OK I could have spent less than I did and bought one of the cheaper kits. However, I'd read nothing good about the basic 18 - 55mm lenses that shipped with them. Plastic mount, lousy manual focus ring, lousy image quality. I also thought the twin kit was just going to be too much too soon (not having shot in 10 years).

The 17 - 85mm looked much better. A ring type USM motor, giving full time manual focus, a decent manual focus ring, metal mount and a bit more reach. Before I drowned my Olympus gear, I ended up settling with a 50mm f/1.8 and a 135mm f/2.8. The 17 - 85mm on the small sensor gives me the equivalent of about 24mm to 135mm so my old range was covered.

The only downside to this nice chunk of glass is that at 17mm the image quality isn't too hot. Zoom in a little and most of the problems go away.

All in all, it's a nice single lens kit (reportedly better than the Canon 18 - 200 kit option) and the IS makes up for some of the problems associated with the smaller maximum aperture (f/4 at 17mm, f/5.6 at 85mm).

File Formats

For an experiment, I tried shooting late into twilight recently in RAW + JPEG. Clearly, the RAW processing built into the camera is far superior to the JPEG processing. Areas that were blown out or too dark in the JPEG resolved detail in the RAW file. I wonder if a better camera would produce better JPEG files.

My obsession

Well, this is quite a distraction from computing. I'm seeing the world differently and am looking for opportunities - often while driving my car. Seeing opportunities can be a challenge but I'm learning, again.

Cheers

01 December 2009

Liberal Party leadership spill

The Age reports that Tony Abbott (are you kidding?!) is the new leader of the federal Liberal Party. He won by a single vote, which apparently was informal. God help him if he ever refers to a mandate to lead the party.

http://www.theage.com.au/national/abbott-wins-liberal-leadership--by-one-vote-20091201-k1va.html?autostart=1

14 October 2009

Experimenting with IronPython

I’ve been playing around with IronPython for a little while now. Until recently, not really doing much more than spelunking around the .NET Framework with the IronPython console. It’s been great for exploring unknown parts of the framework as I can largely throw away the fluff associated with instantiating objects and bothering with types. Dynamic typing is just beautiful.
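A minimal sketch of that kind of spelunking. In an IronPython console you would typically import clr and clr.AddReference a framework assembly first, but the dir()/getattr() poking works the same against any object in plain Python:

```python
def public_members(obj):
    """List an object's public attributes, console-spelunking style."""
    return sorted(name for name in dir(obj) if not name.startswith("_"))

# Poke at a string the way you'd poke at an unfamiliar .NET object in the
# IronPython console - no type declarations, no compilation step.
members = public_members("hello")
print("upper" in members)            # methods discovered without any docs
print(getattr("hello", "upper")())   # call what you found
```

No fluff: you get an instance, ask it what it can do, and call what looks interesting.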

I’ve been engaged in a really sticky project – one of those projects that I wish I figured out some completely different way of doing – and by and large have been reasonably happy using C# and VB.NET to build it.

When I get completely jaded with Visual Studio and C# / VB, I’m discovering that SharpDevelop and IronPython make not only a refreshing change but really help with making progress when I’m building out a sticky piece of code. Previously, I’d open a fresh project in C# in Visual Studio and go from there. It’s OK, but switching to a dynamic language that is C-like has been great. SharpDevelop even has a console with (hallelujah!) Intelli-Sense! :-)

Once I figure out what I need to do, it’s not a big deal to translate the IronPython into C#. Now all I need to do is convince the boss that there’s no need to do the translation – IronPython is great just as it is (except for not supporting LINQ very well)!

Cheers!

25 September 2009

Review of ACS Victoria Branch Forum - Cloud Computing

I attended the Australian Computer Society (ACS) Victorian Branch (of which I am a member) forum this week. Paul Cooper from SMS Management and Technology presented on the topic of Cloud Computing.

He spoke about how cloud computing (of which there are many, incompatible brands) can bring IT costs closer to zero and how we can gain significant processing enhancements for little green cost. He indicated that cloud based infrastructures and applications can mean that a company can purchase capacity through operational expenditure rather than capital expenditure. He made the simplistic challenge early on to try to find a CFO who would rather spend Cap Ex on ICT instead of Op Ex. This would surely be a difficult challenge to meet.

The short summary of my opinion on the talk is that I didn't buy it.

Costs and benefits of reduced infrastructure cost

In arguing for reducing cost (for the company) closer to zero, he did not consider what costs this in itself may have.

Firstly, I think that it becomes too easy to simply buy more capacity. It may become more expedient to do this than to perform systems maintenance (e.g. optimising databases, refactoring software, eliminating files and folders from the backup schedule...) which would otherwise reduce the need for processing power and storage. On a smaller scale, this is already the case - while we can reduce the resource footprint of an application, the cost of developer / DBA time to do this is often greater than the cost of purchasing more powerful hardware.

My second view on this is that we are likely to get something similar to what is happening to music and video - that for many people, it no longer has much financial value. Its value is purely emotional / aesthetic. Some evidence of this is in the prevalence of music file sharing and the rate at which ring tones are purchased. Ring tones are only several bars of a song, not even a whole one. Bringing this back to IT, what happens in a business when IT no longer has significant financial value? The risk is that in reducing IT costs in infrastructure and transactional assets closer to zero, emphasis could shift away from IT as a focus for innovation within the business. I think that one of the drivers of IT innovation presently is that IT is not cheap, and attempting to innovate with it (particularly in the strategic IT portfolio asset class) is seen as a way of deriving value from the investment.

Out of sight, out of mind?

In implementing cloud infrastructure, we end up with a situation similar to plugging a 240V electric car into a power point at night. That is, you're not running a green car / IT environment; you're just outsourcing the environmental cost. The business no longer pays for it; the cloud provider does.

Staying with the power generation analogy, what happens now when instead of paying the green cost of our power in the La Trobe Valley, we outsource that to another country? We have plenty of examples here. Ok Tedi mine in Papua New Guinea comes to mind. Could the effort to save the Franklin River have been applied as successfully? We're not short of examples where if we remove something from sight, we don't care about it anymore.

Paul talked about siting green data centres in places where green power can be harnessed; particularly geo-thermal power in Iceland or New Zealand. Several questions arise: 1. Do we need to destroy natural environments to construct power plants to harness this power source? 2. Is the expense going to be such that companies will prefer to use a dirty data centre in a developing or third world country on cost grounds?

Utility Computing

Paul presented cloud computing as a utility service, subscribing to the views of Nicholas Carr. I have already indicated that Paul dwelt on the cost savings that could be gained in reducing infrastructure capital cost, reducing direct power costs to the business and indirectly reducing transactional costs through gaining more processing power for the dollar / watt of power.

This model has parallels to Carr's idea that the sole aim of technology should be to drive cost as close to zero as possible. Carr's argument is that implementing information technology should be about as complex as "implementing" mains power. The assumption is that everyone uses technology in the same way. Which they don't.

He also envisaged a world where assembling enterprise applications was about as difficult as plugging components together (Lego style). There is evidence of this occurring already, though mainly at the research stage (I think Microsoft Oslo is an example).

Strategic Benefits for a Company

Tele-working

He proposed that cloud computing enables tele-working. No it doesn't. One member of the audience suggested that with a VPN, anyone can tele-work without cloud infrastructure. Paul responded by saying that the cost of this is an extra computer to function as the gateway. This in itself is not necessary: a VPN gateway can be embedded in a router or built into an existing server. Portal software such as SharePoint or WebSphere also allows a company to expose a significant portion of its data over the Internet to authenticated users. Furthermore, applications like Lotus Notes support a disconnected operating mode where not only email but whole databases can be taken off-line for remote, occasionally connected work.

Tele-work is not a sound business case for implementing a cloud based infrastructure.

"Overload" Processing

This is where real strategic benefit can be gained, though I think Paul did not emphasise this advantage sufficiently. He talked about how a company can set up a hybrid cloud, hosting some or many of its applications on internal infrastructure. He was interested in maintaining close to 100% utilisation and bringing externally hosted cloud resources online as needed.

This sounds fair, though I think I'd rather have some spare capacity on my internal infrastructure as the normal load ebbs and flows; particularly processing load but also temporary and to a lesser extent permanent storage.

The advantage I see is for a company requiring significant processing resources intermittently. Perhaps an IT consultancy that periodically performs data cleansing / transformation activities on significant quantities of data on projects for clients could take advantage of cloud infrastructure to shorten the cycle times for these operations. They may only need these substantial resources for 6 to 10 hours per week. It would be uneconomical for this consultancy to have this kind of processing power in house, idle for most of the week. The alternative for this company may be to scale down the processing power and just have to make do with the longer cycle times. The risk here is that the live data evolves beyond the state of the data in the set being used for the transformation.

Summary

Ultimately, what do I think about cloud computing? Where is its value? What are the risks? What are the costs?

Cloud computing is being marketed as a cost reducing measure. While this is so, the case for cloud computing is thin. A cost can only be cut once, and once everyone else does it, there's no longer any advantage. A company must then either maintain its expenditure on cloud computing just to keep up with the Joneses, or must find a way of innovating its way forward with it.

There are environmental risks as I have discussed and legislative risks which I have not discussed.

Availability of the services that exist to date is high, so this is not an issue. Availability will be mostly affected by the reliability of the business's Internet gateway. A DoS attack on a company router that carries hosted ERP or CRM (or Google Apps) traffic will shut down the business until service can be resumed. At least everyone will have time to catch up on filing.

The success of cloud computing will depend on how it can be used to create sustainable competitive advantage for a company that chooses to innovate with it. In order to be more than just a passing fad, the argument needs to evolve from the current cost cutting and Green IT issues and demonstrate how cloud computing can strategically enhance a company.

18 June 2009

CSIRO pursues WiFi royalties | Australian IT

Good news for CSIRO, the Australian Government science and research body. For years, several technology companies (including Dell, Microsoft, Intel and HP) have tried to invalidate patents CSIRO holds in the USA concerning wireless LAN (802.11) standards.

These companies have argued (unsuccessfully) that government-funded patents belong in the public domain.

Surely, a loss for CSIRO would have had every single government in the world wondering what value there is for government research.

What will happen now? Surely the group of companies involved will appeal the decision. There’s so much money at stake for them. Meanwhile, CSIRO is targeting the next group, likely to include smartphone manufacturers such as Nokia, Motorola and Samsung.


11 June 2009

Use VB 9 XML Literals to parse an XML tree of unknown depth

I’m in the process of building an editor interface for a Flex program we’ve written at my work. It’s my first real project using LINQ. I’ve mainly been using C# but decided to take advantage of XML Literals in VB 9.

The structure of the Flex program is held in an XML file. I needed to read this XML file in so that the whole navigation structure could be held in a tree control. This is different to the Flex program itself, which only needs to display part of the navigation tree at once.

To do this, I needed to transform the tree. As I saw it, I had two options: XSLT (yuck) or LINQ.

The XML doc is structured like the example below.

<sections>
  <section>
    <id>section1</id>
    <label>This is section 1</label>
    <layout>navStructure</layout>
    <properties>
      <tree>
        <!-- Nested collection of node elements -->
        <node label="Blah">
          <node label="Blah 1" url="somedoc.pdf"/>
          <node label="Blah 2" url="somedoc2.pdf"/>
        </node>
      </tree>
    </properties>
  </section>
  <!-- More section elements -->
</sections>

There are several <section> elements with this layout. The <node> elements also have variable nesting levels.

The hard part is in writing a LINQ query that will find all the <node> elements without knowing the nesting depth beforehand.

This first section of VB is in the calling Sub. Note the second expression hole, which calls out to the BuildChild function.

The BuildChild function takes an XElement as a parameter and writes a <node> element with the label and url attributes. An expression hole is then opened which recursively calls back into the BuildChild function on the collection of XElements contained in the current XElement.

Dim doc As XDocument = XDocument.Load("content.xml")
Dim tree = _
    <tree>
        <%= From n In doc.<sections>.<section> _
            Where n.<layout>.Value = "navStructure" _
            Select <node label=<%= n.<label>.Value %> url="">
                       <%= From m In n.<properties>.<tree>.<node> _
                           Select BuildChild(m) %>
                   </node> %>
    </tree>

Private Function BuildChild(ByVal element As XElement) As XElement
    Dim retElement As XElement

    retElement = <node label=<%= element.@label %> url=<%= element.@url %>>
                     <%= From e In element.Elements _
                         Select BuildChild(e) %>
                 </node>

    Return retElement
End Function

This is how I’ve solved my problem for now. My understanding is that LINQ is more closely related to functional programming in its feel and I suspect that it can provide a better solution. Recursion is difficult to read and debug and it would be great to get rid of it if possible.
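For what it's worth, the same unknown-depth walk translates naturally to (Iron)Python. Here's a rough sketch using the standard ElementTree module against the sample document above; the recursion is still there, just a little easier to read:

```python
import xml.etree.ElementTree as ET

SAMPLE = """\
<sections>
  <section>
    <id>section1</id>
    <label>This is section 1</label>
    <layout>navStructure</layout>
    <properties>
      <tree>
        <node label="Blah">
          <node label="Blah 1" url="somedoc.pdf"/>
          <node label="Blah 2" url="somedoc2.pdf"/>
        </node>
      </tree>
    </properties>
  </section>
</sections>"""

def build_child(element):
    """Recursively copy a <node> element and its children of unknown depth."""
    out = ET.Element("node", label=element.get("label", ""),
                     url=element.get("url", ""))
    for child in element.findall("node"):
        out.append(build_child(child))
    return out

root = ET.fromstring(SAMPLE)
tree = ET.Element("tree")
for section in root.findall("section"):
    if section.findtext("layout") == "navStructure":
        # Top-level node per matching section, as in the VB query above.
        top = ET.Element("node", label=section.findtext("label"), url="")
        for n in section.findall("./properties/tree/node"):
            top.append(build_child(n))
        tree.append(top)

print(ET.tostring(tree).decode())
```

The build_child helper plays the same role as the VB BuildChild function: each call copies one node and recurses over whatever children happen to be there, so no nesting depth needs to be known in advance.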

Cheers
Mike