14 October 2009

Experimenting with IronPython

I’ve been playing around with IronPython for a little while now. Until recently, I hadn’t really done much more than spelunk around the .NET Framework with the IronPython console. It’s been great for exploring unknown parts of the framework, as I can largely throw away the fluff associated with instantiating objects and bothering with types. Dynamic typing is just beautiful.
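
For a flavour of what I mean, here’s a minimal sketch of a console session (the XML is just a throwaway example; any .NET type works much the same way):

import clr
clr.AddReference("System.Xml")
from System.Xml import XmlDocument

# No type declarations, no casts -- just poke at the object.
d = XmlDocument()
d.LoadXml("<root><child/></root>")
print d.DocumentElement.Name    # prints: root
print dir(d)[:6]                # spelunking: what can an XmlDocument do?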

I’ve been engaged in a really sticky project – one of those projects that I wish I’d figured out some completely different way of doing – and by and large I’ve been reasonably happy using C# and VB.NET to build it.

When I get completely jaded with Visual Studio and C# / VB, I find that SharpDevelop and IronPython not only make a refreshing change but really help me make progress on a sticky piece of code. Previously, I’d open a fresh C# project in Visual Studio and go from there. That’s OK, but switching from a C-like language to a dynamic one has been great. SharpDevelop even has an IronPython console with (hallelujah!) IntelliSense! :-)

Once I figure out what I need to do, it’s not a big deal to translate the IronPython into C#. Now all I need to do is convince the boss that there’s no need to do the translation – IronPython is great just as it is (except for not supporting LINQ very well)!

Cheers!

25 September 2009

Review of ACS Victoria Branch Forum - Cloud Computing

This week I attended a forum of the Australian Computer Society (ACS) Victorian Branch, of which I am a member. Paul Cooper from SMS Management and Technology presented on the topic of Cloud Computing.

He spoke about how cloud computing (of which there are many incompatible brands) can bring IT costs closer to zero, and how we can gain significant processing enhancements for little green cost. He indicated that cloud-based infrastructure and applications mean a company can purchase capacity through operational expenditure rather than capital expenditure. He made the simplistic challenge early on to find a CFO who would rather spend Cap Ex on ICT than Op Ex. That would surely be a difficult challenge to meet.

The short summary of my opinion on the talk is that I didn't buy it.

Costs and benefits of reduced infrastructure cost

In arguing for reducing cost (for the company) closer to zero, he did not consider what costs this may itself incur.

Firstly, I think it becomes too easy to simply buy more capacity. It may become more expedient to do this than to perform systems maintenance (eg. optimising databases, refactoring software, eliminating files and folders from the backup schedule...) which would otherwise reduce the need for processing power and storage. On a smaller scale, this is already the case - while we can reduce the resource footprint of an application, the cost of developer / DBA time to do this is often greater than the cost of purchasing more powerful hardware.

My second view is that we are likely to see something similar to what is happening to music and video - for many people, it no longer has much financial value. Its value is purely emotional / aesthetic. Some evidence of this is the prevalence of music file sharing and the rate at which ring tones are purchased. Ring tones themselves are only several bars of a song, not even a whole one. Bringing this back to IT: what happens in a business when IT no longer has significant financial value? The risk is that in reducing the cost of IT infrastructure and transactional assets closer to zero, emphasis could shift away from IT as a focus for innovation within the business. I think that one of the drivers of IT innovation at present is that IT is not cheap, and attempting to innovate with it (particularly in the strategic IT portfolio asset class) is seen as a way of deriving value from the investment.

Out of sight, out of mind?

In implementing cloud infrastructure, we end up with a situation similar to plugging a 240V electric car into a power point at night. That is, you're not running a green car / IT environment; you're just outsourcing the environmental cost. The business no longer pays for it; the cloud provider does.

Staying with the power generation analogy, what happens when, instead of paying the green cost of generating our power in the La Trobe Valley, we outsource that cost to another country? We have plenty of examples here - the Ok Tedi mine in Papua New Guinea comes to mind. Could the effort to save the Franklin River have been applied as successfully overseas? We're not short of examples where, once we remove something from sight, we stop caring about it.

Paul talked about siting green data centres in places where green power can be harnessed, particularly geothermal power in Iceland or New Zealand. Several questions arise:

  • Do we need to destroy natural environments to construct power plants to harness this power source?
  • Will the expense be such that companies prefer a dirty data centre in a developing or third world country on cost grounds?

Utility Computing

Paul presented cloud computing as a utility service, subscribing to the views of Nicholas Carr. I have already indicated that Paul dwelt on the cost savings that could be gained in reducing infrastructure capital cost, reducing direct power costs to the business and indirectly reducing transactional costs through gaining more processing power for the dollar / watt of power.

This model parallels Carr's idea that the sole aim of technology management should be to drive cost as close to zero as possible. Carr's argument is that implementing information technology should be about as complex as "implementing" mains power. The assumption is that everyone uses technology in the same way. They don't.

He also envisaged a world where assembling enterprise applications is about as difficult as plugging components together (Lego style). There is evidence of this occurring already, though mainly at the research stage (I think Microsoft Oslo is an example).

Strategic Benefits for a Company

Tele-working

He proposed that cloud computing enables tele-working. No it doesn't. One member of the audience suggested that with a VPN, anyone can tele-work without cloud infrastructure. Paul responded that the cost of this is an extra computer to function as the gateway. That isn't necessary: VPN gateways are often embedded in routers or built into servers. Portal software such as SharePoint or WebSphere also allows a company to expose a significant portion of its data over the Internet to authenticated users. Furthermore, applications like Lotus Notes support a disconnected operating mode in which not only email but whole databases can be taken off-line for remote, occasionally connected work.

Tele-work is not a sound business case for implementing a cloud based infrastructure.

"Overload" Processing

This is where real strategic benefit can be gained, though I think Paul did not emphasise this advantage sufficiently. He talked about how a company can set up a hybrid cloud, hosting some or many of its applications on internal infrastructure, maintaining close to 100% utilisation and bringing externally hosted cloud resources online as needed.

This sounds fair, though I think I'd rather keep some spare capacity on my internal infrastructure as the normal load ebbs and flows - particularly for processing, but also for temporary and, to a lesser extent, permanent storage.

The advantage I see is for a company requiring significant processing resources intermittently. Perhaps an IT consultancy that periodically performs data cleansing / transformation on significant quantities of client data could take advantage of cloud infrastructure to shorten the cycle times for these operations. It may only need these substantial resources for 6 to 10 hours per week, and it would be uneconomical to keep that kind of processing power in house, idle for most of the week. The alternative for such a company may be to scale down the processing power and make do with longer cycle times. The risk there is that the live data evolves beyond the state of the data set being used for the transformation.

Summary

Ultimately, what do I think about cloud computing? Where is its value? What are the risks? What are the costs?

Cloud computing is being marketed as a cost reducing measure. While this is so, the case for it is thin. A cost can only be cut once, and once everyone else cuts it too, there's no longer any advantage. A company must then either maintain its expenditure on cloud computing just to "keep up with the Joneses", or find a way of innovating its way forward with it.

There are environmental risks as I have discussed and legislative risks which I have not discussed.

Availability of the services that exist to date is high, so this is not an issue. Availability will mostly be affected by the reliability of the business's Internet gateway. A DoS attack on a company router that carries hosted ERP or CRM (or Google Apps) traffic will shut down the business until service can be resumed. At least everyone will have time to catch up on filing.

The success of cloud computing will depend on how it can be used to create sustainable competitive advantage for a company that chooses to innovate with it. To be more than a passing fad, the argument needs to evolve beyond the current cost cutting and Green IT issues and demonstrate how cloud computing can strategically enhance a company.

18 June 2009

CSIRO pursues WiFi royalties | Australian IT

Good news for CSIRO, the Australian Government science and research body. For years, several technology companies (including Dell, Microsoft, Intel and HP) have tried to invalidate patents CSIRO holds in the USA concerning wireless LAN (802.11) standards.

These companies have argued (unsuccessfully) that government patents belong in the public domain.

Surely, a loss for CSIRO would have had every single government in the world wondering what value there is in government research.

What will happen now? Surely the group of companies involved will appeal the decision. There’s so much money at stake for them. Meanwhile, CSIRO is targeting the next group, likely to include smartphone manufacturers such as Nokia, Motorola and Samsung.


11 June 2009

Use VB 9 XML Literals to parse an XML tree of unknown depth

I’m in the process of building an editor interface for a Flex program we’ve written at my work. It’s my first real project using LINQ. I’ve mainly been using C# but decided to take advantage of XML Literals in VB 9.

The structure of the Flex program is held in an XML file. I needed to read this XML file in so that the whole navigation structure could be held in a tree control. This differs from the Flex program itself, which only needs to display part of the navigation tree at once.

To do this, I needed to transform the tree. As I saw it, I had two options: XSLT (yuck) or LINQ.

The XML doc is structured like the example below.

<sections>
    <section>
        <id>section1</id>
        <label>This is section 1</label>
        <layout>navStructure</layout>
        <properties>
            <tree>
                <!-- Nested collection of node elements -->
                <node label="Blah">
                    <node label="Blah 1" url="somedoc.pdf"/>
                    <node label="Blah 2" url="somedoc2.pdf"/>
                </node>
            </tree>
        </properties>
    </section>
    <!-- More section elements -->
</sections>

There are several <section> elements with this layout. The <node> elements also have variable nesting levels.

The hard part is in writing a LINQ query that will find all the <node> elements without knowing the nesting depth beforehand.

This first section of VB lives in the calling Sub. Note the second expression hole, which calls out to the BuildChild function.

The BuildChild function takes an XElement as a parameter and writes a <node> element with the label and url attributes. An expression hole is then opened, which recursively calls back into BuildChild on the collection of XElements contained in the current XElement.

' This first block lives inside the calling Sub (see above).
' Note: doc.<module> assumes the real file's root element is
' <module>, wrapping <sections>; the sample above omits it.
Dim doc As XDocument = XDocument.Load("content.xml")

Dim tree = _
    <tree>
        <%= From n In doc.<module>.<sections>.<section> _
            Where n.<layout>.Value = "navStructure" _
            Select <node label=<%= n.<label>.Value %> url="">
                       <%= From m In n.<properties>.<tree>.<node> _
                           Select BuildChild(m) _
                       %>
                   </node> _
        %>
    </tree>

' Recursively copies a <node> element and all of its descendants.
Private Function BuildChild(ByVal element As XElement) _
        As XElement
    Dim retElement As XElement

    retElement = <node label=<%= element.@label %> _
                     url=<%= element.@url %>>
                     <%= From e In element.Elements _
                         Select BuildChild(e) %>
                 </node>

    Return retElement
End Function

This is how I’ve solved my problem for now. My understanding is that LINQ has a more functional-programming feel, and I suspect it can provide a better solution. Recursion is difficult to read and debug, and it would be great to get rid of it if possible.
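
One avenue I want to try: the Descendants axis (doc...<node> in VB 9) already walks an entire subtree, so no recursion is needed just to find every <node>, whatever its depth. Rebuilding the hierarchy would still take a recursive copy, but where a flat list is enough, a sketch like this (here in IronPython, assuming the same content.xml as above) does the job:

import clr
clr.AddReference("System.Xml.Linq")
from System.Xml.Linq import XDocument, XName

doc = XDocument.Load("content.xml")

# Descendants visits every element in the subtree, so nesting
# depth no longer matters -- this finds all <node> elements.
for n in doc.Descendants(XName.Get("node")):
    label = n.Attribute(XName.Get("label"))
    if label is not None:
        print label.Value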

Cheers
Mike

10 June 2009

Exploring open source - Part 2

Firstly, a little about my journey with open source software. I'm by no means an "expert" (def: an unknown drip under pressure) but neither am I a total ignoramus.

Intro to Linux - 2001

I came across Linux during my Diploma in IT in 2001 / 2002, where we were exposed to Red Hat 5 or 6 or thereabouts. This is where I was introduced to the concept of open source software. It didn't last long: I figured that since we lived in a Windows world, I had to keep up with Windows, and I prioritised my efforts there. Getting a job during the tech wreck was work enough.

The idea of open source in the Windows world certainly wasn't obvious to me then, so I can't say what the state of it was back then.

Asterisk Implementation

My first real use of open source came in 2005, when my boss set me the task of developing the company's knowledge of the VoIP PBX Asterisk. Four years after playing with Linux, I was still able to remember enough to get going quickly. (hooray!!!) I actually managed to get a working installation off the ground, complete with an exim4 service to handle voicemail.

Current Open Source Usage

Since then, in the Windows world, I've been playing with open source software mainly in the software development space but also a little in the support space. Principally, I built a company intranet on DotNetNuke (DNN), a wonderful VB.NET-based platform for hosting content management systems. Otherwise, my main use of open source has been development support tools like Subversion and TortoiseSVN (of course), MbUnit, NUnit, NMock2 (yeah, so what if Rhino Mocks is "better" - I like the expressiveness of NMock2, and you gotta start somewhere) and SharpDevelop, and I'm now dipping into IronPython (the Python implementation for the .NET Framework). I also configured and deployed a helpdesk / support tool in PHP and MySQL (of questionable quality). I'm sure there are others that elude me for now. Some of these tools are ports of Linux / UNIX apps, while others are ports of (or inspired by) Java tools (possibly open source).

It's interesting to note that the purpose of all of these tools has been to make the life of IT staff easier or more productive. No open source software that I have been in contact with has been for the purpose of enhancing end user productivity. DNN is kind of the exception, although its reason for existence was to remove me from the process of posting new content to the intranet. I learned that IT staff should not be an essential part of the business process. Implementing DNN meant that users could post content themselves with little intervention from IT.

Open Office.org

I have dipped into Open Office several times since version 1.0 (possibly a little earlier), and every time I have also dipped out. For a while in 2007, studying part time while holding down a full-time job, I used it in preference to my other poor choice: Microsoft Office 97. I continually found that menu choices were hiding somewhere unexpected, and getting help was difficult - at least as difficult as in Microsoft Office, if not more so. The final straw came when editing a table was harder than it should have been. I ditched it in favour of Office. At least the thing worked according to my expectations.

My most recent exploration of Open Office was last week, when I investigated version 3.1. I opened a moderately complex Word doc, a document from Microsoft, “Deployment and upgrade for Office SharePoint Server 2007” (link). It contains an automatically generated table of contents, hyperlinks hidden behind display text and some pictures. There might be some tables and bulleted lists. When I opened it, some of these hyperlinks had been inexplicably changed from Arial to Symbol font.

Apologies to the Open Office folks, but this is a deal breaker. If I'm ever in a position of specifying a new office suite for a company, I am not going to recommend deploying a product that does this. It just has to get this right.

Summing Up

If you get me on a bad day, I'll be a Microsoft bigot and will curse open source to the ends of the earth. It's wrong for me to do this and once I calm down, I'll tell you that I'll consider it if:

  • it works reliably
  • I'm looking for software that will do a task better
  • I don't hate the world on the day you ask. ;)

Till next time
Cheers
Mike

09 June 2009

Exploring open source

I’ve spent several days recently in the company of an avid open source implementer and advocate. Until recently, he featured on an Australian radio show as the resident open source (ie. Linux) geek.

Anyway, I’m very firmly in the Microsoft camp so we ended up in an argument over the worth of open source software.

The best thing that came of it was that I decided I really should know the state of play in the open source world, and know what I think (and be able to support it) about open vs. proprietary (or closed source) software. As a designer and implementer of solutions, I need that understanding if I am to make a sound business decision on whether to employ or avoid it in a given situation.

My pre-condition to this discussion is that open source or proprietary software bigotry has no place in IS design and implementation. It only serves to remove tools from our toolkits.

What I hope to cover in what I expect will become a series of posts includes:

  • What I think about the various licences that exist (eg. GPL, LGPL, MIT, MS-PL, etc.)
  • How open source can contribute to a business in terms of
    • cost savings
    • competitive advantage
  • How a company can make a living out of open source development
  • Whether open source is really more secure because anyone can read the code

At the end of all this, my expectation is that I will have a better understanding of open source and have some opinions which I can support with evidence of where and when I’d consider employing it.

30 April 2009

SOLID principles are about habits

Last February there was a furore in the .NET world that started when Robert C. Martin (Uncle Bob) went on Hanselminutes #145 (link). Joel Spolsky later went on the Stack Overflow podcast and was widely quoted as saying that quality doesn’t matter much. And the blogosphere went nuts…

Remembering back to when I learned to program – I learned syntax and so on. While we were instructed to focus on design, we were never really taught how, or what constitutes good design (of which the SOLID principles are part). So I never developed good design habits. Now that I'm trying to learn and apply them, it's more difficult because I already have some bad ones. It's like learning to ski as an adult while watching 5 year olds zoom past backwards… it's so much easier to learn early on.
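
To make “good design” slightly less abstract, here’s a minimal sketch of the S in SOLID – Single Responsibility: one reason to change per class. The Report names here are made up purely for illustration:

# Bad habit: one class both formats the report and writes it to
# disk, so a change to either concern means editing the same class.
class ReportAndSaver(object):
    def __init__(self, title, lines):
        self.title = title
        self.lines = lines

    def save(self, path):
        with open(path, "w") as f:
            f.write(self.title + "\n" + "\n".join(self.lines))

# Better: one responsibility each. Formatting and persistence can
# now change (and be tested) independently.
class Report(object):
    def __init__(self, title, lines):
        self.title = title
        self.lines = lines

    def render(self):
        return self.title + "\n" + "\n".join(self.lines)

class ReportWriter(object):
    def save(self, report, path):
        with open(path, "w") as f:
            f.write(report.render())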