Monday, August 24, 2009

Our Quality Vision (and Addressing Our Quality Past)

Like all software companies, we at Space-Time Research have juggled customer demands, complex software, very different uses of our software, and ever-changing requirements. This has sometimes resulted in us delivering releases to our customers that were not of sufficient quality, and later than we planned.

In the past, and as recently as our 6.3 release, our testing group has passed a release, the software has been delivered to a customer, and then a critical issue has been found. One of the main reasons this happens is that every customer has a slightly different environment. We currently support Solaris, Red Hat Linux, and both 32-bit and 64-bit Windows; Windows XP and Vista for our client applications; and browsers including IE6, IE7, IE8, Chrome, Firefox and Safari. We read data from any relational database that has a JDBC driver, including Oracle, SQL Server, DB2 and others, plus different types of text files. We provide mapping with ESRI ArcIMS, ArcGIS Server, Google Maps and soon Bing Maps. We test all these environments, and on our own servers the testing can pass.
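
(For non-developers, here is roughly what "any relational database that has a JDBC driver" means in practice. This is only an illustrative sketch, not our actual loading code - the connection URL, credentials and table name below are made up - but it shows why the same Java code path works whether the database behind it is Oracle, SQL Server, DB2 or something else: only the connection URL and driver jar change.)

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class JdbcReadSketch {
        public static void main(String[] args) throws Exception {
            // Hypothetical Oracle URL - swap for jdbc:sqlserver://..., jdbc:db2://..., etc.
            String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
            Connection conn = DriverManager.getConnection(url, "user", "password");
            Statement stmt = conn.createStatement();
            // Hypothetical table; the read loop is identical for any JDBC source.
            ResultSet rs = stmt.executeQuery("SELECT region, population FROM census_facts");
            while (rs.next()) {
                System.out.println(rs.getString("region") + ": " + rs.getLong("population"));
            }
            rs.close();
            stmt.close();
            conn.close();
        }
    }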

Then we get out to the customer site and encounter different environments and constraints. Not everyone can host a Tomcat application, so we might have to hook into IIS. Firewalls might be an issue. Ports might be an issue. The client might operate remotely. Even if we don't officially support a configuration, our clients will implement it that way anyway, and it's up to us to sort it out.

Once we have the software successfully installed and configured at a client site, they then build some databases and work out how they are going to analyse or visualise their information. Every client has different types of databases, structures and uses of their information. Our testing doesn't cover every different type of database - we try to, but of course we can't cover everything. So sometimes we miss things - hierarchical summation options being a recent example.

Finally, our customers use the software with their own workflow. We follow a standard workflow with our automated tests, and then we conduct exploratory testing that mimics what a customer would do, but as we are not the customer, we don't always get that exactly right either.

So, how do we improve it? What have we done and what are we doing next?

Firstly, for our 6.5 General Availability Release, Space-Time Research defined the following quality vision:
  • Timely, relevant, functioning software that works!
  • Performance, stability and resiliency focus.
  • Deliver releases of SuperSTAR that are perceived within STR and by our partners and customers as better than the previous release.
All decisions about testing, which bugs we fix, and when we release our software are related back to this quality vision.

We implemented a partnership approach with selected customers to enable them to test pre-release versions of our software. We conducted fortnightly builds, ran a couple of days of testing and then made the builds available to the customers. Builds were provided via an FTP site, and customers were able to download the software and install it in their own test environments. The customers were able to choose whether or not to take a build. STR also hosted versions of our web applications so customers could do user interface testing without having to run their own installation and configuration.

The customers reported bugs, their severity and their own priority via our normal support channel (email to support@spacetimeresearch.com). We regularly triaged the reported bugs, and communicated via conference call with each customer to advise what we intended to do, or to discuss concerns.

The benefits of this approach were clear for each customer involved:
  • Integration and configuration issues were ironed out during the pre-release phase.
  • Customer-focused testing found issues we would never have found.
  • The end delivery held no surprises.
  • We delivered on time to those customers and met their deadlines.
The 6.5 General Availability release is almost complete on all platforms. I'll do another blog post and announcement about that separately.

For our next release, we are implementing a fully agile development process. Another blog post on that is coming too! But for our customers, please know that we want to:
  • Involve more customers in pre-release testing.
  • Collect more sample databases from customers.
  • Collect reference data sets from customers so we can validate our statistical routines.
  • Use client test beds for complex or unusual environments.
  • Open up our change management and support processes so customers can track issues they are interested in.
cheerio

Jo


Wednesday, August 19, 2009

Open Data Initiative - Free SuperVIEW hosting of data


Space-Time Research this week launched a new program called the Open Data Initiative at the International Statistical Institute (ISI) 2009 conference in Durban.


What is the Open Data Initiative?
The Open Data Initiative is a Web 2.0 site for disseminating public data. Users discover and explore data in a rich, interactive, and intuitive application, rather than browsing or reading large documents of published tables and charts. The end user can select and visualize any combination of data, and the resulting view can be exported, printed, linked to, and shared in collaboration environments.

The Open Data Initiative is a freely available online service for the creation and dissemination of data for public consumption. You have the data; we have the service to disseminate it to the public.

The Open Data Initiative is hosted on the Google App Engine cloud, enabling providers of public data to create engaging and rich Web 2.0 experiences built on top of Space-Time Research's SuperVIEW product suite. This provides transparent, lightning-fast web traffic responsiveness, scalability and built-in redundancy no matter where in the world you are.

Data types suitable for the Open Data Initiative: Health, Transport, Education, Agriculture, Population Statistics, Labour Force, etc.

How do I sign up?
Contact us via the Open Data Initiative website.

Key Benefits. The Open Data Initiative:
  • Is Cost and Time Efficient — Reduces the workload on your data analysts and researchers.
  • Provides Data that is Complete — Why compromise on providing a subset of the data? Maximize the ability of the public to self‐service data of personal interest.
  • Provides Data as Service — Now you can provide a new online data service to the public.
  • Protects the Relevance of Your Brand — Provide an engaging and rewarding experience for the public. This reinforces the relationship of trust they have in your organization.
  • Delivers Data Integrity — Have confidence that the public are seeing the right numbers, graphs, and maps, and reaching the correct interpretation and understanding behind those numbers.
  • Delivers Data Responsiveness — Minimize the time between data collection and data dissemination to ensure maximum relevancy of the data to the audience.
  • Creates Communities of Users — Ensure the online experience can be captured and shared by the public in collaborative environments from blogs to Twitter.

Frequently asked questions from some of our early adopters:

Q. What is the business model for Space-Time Research?
A. This is a free service, and as such it has business model restrictions for customers - they cannot charge a fee for access to their created sites. The sites must be public and not sit behind authentication or payment gateways. We have a paid service available that overcomes these restrictions, but the free service is a good way to test-drive the technology and the dissemination approach initially. Alternatively, customers can purchase a paid SuperVIEW software license and implement their own business model around a deployed SuperVIEW.

Q. What about confidentiality?
A. No confidentiality capabilities are offered with the free SuperVIEW. The Open Data Initiative will host all data in the cloud, so by its nature the data provided should not contain confidential information. We can provide a confidential cloud-based service using our Hybrid connector, but this becomes a paid solution engagement.

Q. How do statistical boundaries get loaded?
A. We will detail this in the data collection process over the next week with the people who sign up to our early adopter program, but we think it will be along the lines of providing a shapefile (with some size limits - i.e. pre-simplified and for particular areas) or KML to us.

Q. How does the application get integrated with the data provider's website?
A. Option 1 -> provide a link that takes the user from the data provider's website to the Open Data Initiative website.
Option 2 -> use an IFRAME to embed the Open Data Initiative hosted site into their website.

Friday, August 7, 2009

My favourite Cloud Reading List / Resources

A sampler of general resources I have found useful:

A great site - good definitions, interesting information. Written mostly by vendors. I'm signed up to be a contributor.

I'm not a fan of the word manifesto but this is a group that believes that cloud offerings should be open and compatible with each other. Good definitions as well, and good ideas.

Very interesting articles on this site, but as you will see, the advertising and navigation are annoying. Persevere.

Specific articles / resources:
I found this when I googled "does google app engine have an SLA". Gold.

Gartner's 7 cloud computing risks article

Another beware of risk article - this time from an Australian Government perspective.

The definitive pricing document for Google App Engine and how the quotas work.

Kundra's continued support of the idea is what makes me think it's all going to happen.

Amazon EC2 links:

Please feel free to add your own.

SuperSTAR and Cloud - nutting through the details of Google App Engine

I have spent the past week further working through the details of cloud offerings and how they can be used by existing and new SuperSTAR customers. It's still a hot topic, and I'm still sure it's a really good thing and something that we should be offering our customers if they want it.

So far, Space-Time Research has put our SuperVIEW software into the Google App Engine cloud and demonstrated that it works just fine. We're confident that we're addressing many of the security concerns associated with clouds because we've chosen to go with a hybrid model, where the core database and SuperSERVER software still reside in the client's own server environment. It's just the application itself that is in the cloud. The data passed to the application is already aggregated and encrypted, and it is the same data that would be passed to the web browser, and then to the user, regardless of whether the app is hosted in a cloud or not. So most of the security and data location concerns are not an issue with this model. [Aside: it will be an issue for our SuperWEB product, so we need to come up with a different way of handling that.]

At the risk of repeating myself, I do see some clear benefits of using this model over an internally hosted web server:
  • For clients who already have a SuperSTAR infrastructure, external web hosting can sometimes be difficult to arrange. This is an easy and inexpensive way to get around it.
  • Clients can take advantage of the scalability offered and handle peak loads without having to buy massive servers.
  • Of course, there are others - they're in last week's blog post.
So now we have a viable demonstration to show customers. The next questions I had to answer were: if a customer wants it, would it work for them in a production setting? How scalable is it really? Is it true that there is no SLA for the Google App Engine? And how much would it cost, and for what? Here's what I found out:
  • It's true that there is no SLA for the Google App Engine. I reckon this rules out half of our customers straight away. Especially those who are data providers like the Australian Bureau of Statistics and want to reliably provide access to data and analytical tools to the world 24/7. Other customers, such as those who use our software for internal or researcher use, or those who are just starting out with SuperVIEW, might take this risk on board and try it out.
  • It's really difficult to work out what it would actually cost. Everything is costed by usage per day, and there's the option of getting a free service that then jumps into a paid service, or a paid service that gets more expensive as your user base grows. What we did work out was that it would be free for most SuperVIEW applications up to 2,000 user sessions per day. After that, it would cost approximately $300 USD per month for each additional 1,000 users (a rough worked example is sketched below).
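
To make the pricing concrete, here's a back-of-the-envelope sketch based purely on the figures above (as I read them: free up to about 2,000 user sessions per day, then roughly $300 USD per month for each additional 1,000 daily sessions). It's illustrative only - real App Engine billing is metered per resource (CPU, bandwidth, storage and so on), not per session, and the numbers will no doubt change.

    public class AppEngineCostSketch {
        // Rough figures from this post; treat them as assumptions, not a quote.
        static double estimateMonthlyCostUsd(int dailySessions) {
            int freeDailySessions = 2000;
            double costPerExtraThousandPerMonth = 300.0;
            if (dailySessions <= freeDailySessions) {
                return 0.0;
            }
            double extraThousands = Math.ceil((dailySessions - freeDailySessions) / 1000.0);
            return extraThousands * costPerExtraThousandPerMonth;
        }

        public static void main(String[] args) {
            System.out.println(estimateMonthlyCostUsd(1500)); // 0.0 - inside the free tier
            System.out.println(estimateMonthlyCostUsd(5000)); // 900.0 - 3,000 extra daily sessions
        }
    }

So a SuperVIEW site doing 5,000 sessions a day would be looking at very roughly $900 USD a month on those numbers - cheap enough to pilot with, which is really the point.
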
Conclusion:
Using the Google App Engine for SuperVIEW is a good way for us to get started with a real cloud offering. We can do this cheaply, and we can pilot and test it over the next 6 months with some of our customers. Space-Time Research needs to provide alternatives for our customers so they can make informed decisions about which method to go with.

We know we have cloud-enabled software, and maybe this is enough for now. Our software doesn't work on every cloud (e.g. it won't work on Microsoft Azure because our application is Java-based), but it operates in virtual environments and can be hosted securely in some way outside a customer's server environment.

We need to understand more about the potential governmental restrictions - in particular what they mean for Australia, the US and the EU, which is where most of our customers come from. And what our customers expect us to do for them - do they want us to find the provider, or just provide assurance that our software will work in the cloud of their choosing? There is no point in us going down the path of recommending certain providers and finding out that the government would NEVER choose them.

A whole other topic, one I will not write about but the gov2.0 thinkers are writing about, is the public servant aversion to risk and change, which means it's possible that no-one really wants to do this anytime soon.

Next steps:
  • I am preparing a white paper for our sales team and customers on what we offer now and intend to offer in the future. This will have enough technical details to be able to talk to project owners / sponsors, but IT representatives will need more detail.
  • I'm finding out as much as I can about what governs our customer decisions now. I'm keen to get help on that because it's really hard to find out.
  • I'm talking to Gartner analysts to get their take. Particularly as the article I'm sent most often is one that Gartner wrote about the security concerns of cloud and what to watch out for.
  • I'm talking to a real cloud provider - Telstra, an Australian provider who is going to host all of Visy Recycling's applications, which is a significant move.
  • I'm going to ask questions of the Australian gov2.0 taskforce and see what they think about it.
  • I'm going to keep reading the articles I get sent every day.

Reflection:
Even though there is so much information out there, and so many articles and blogs being written about cloud computing at the moment, there is no one-stop answer shop that answers my questions. As I've been searching and figuring things out this week, I've really wished that I could just pick up the phone and call the Google App Engine product manager and talk to a real person. As I write, I realise I could have at least tried. This week's resolution is to ask more questions of real people via whatever means is appropriate.
