Blocking Production Issues

You specify what is to be developed. Users review it. You develop it. The architect reviews the solution. You do unit tests. You finish development. You do integration tests and functional tests. You do non-regression tests. You deploy to pre-production. You ask your users to do UAT. They do UAT. Everything is OK. Then the users happily give the Go for production. You deploy to production. All is well.

Users start using it. Within an hour you get a blocking problem that has to be fixed ASAP. How is that possible? The problem is with the production environment. That’s why I joke: after pre-production, install in production but please don’t give production to the users; instead give pre-production to the users – that’s where everything works for everybody.

Seriously – what’s wrong and where are we making mistakes?

Writing the right test cases is one important thing. If you don’t have good business analysts who write great test cases, then it is better to test with real production data for the actual scenario. For example, if you expect your users to upload a file with their own data, get the very file they will upload in production and try to upload it against an existing production database dump. You’ll see all the cockroaches coming out in all directions.

Most of the time the users will say that they haven’t prepared that file yet. If they haven’t prepared the file for UAT, how are they going to upload it to production just five days later? Push them to prepare the file. One other advantage: you’ll see what will happen in production on D-Day – including any performance hurt lockers.

ADT Crashes when creating APK

With the default settings of ADT, every time you try to create an APK, it silently crashes. Incredible! I tried a few things before I hit Google – nothing worked. The problem is that, by default, “Build Automatically” is enabled. So disable it – and then export your APK. But I am still wondering how they could ship this. While creating an APK, couldn’t they disable the “Build Automatically” setting themselves? Small things matter.

Saturated business?

There is no such thing as a saturated business. When everybody thought the e-mail business was overcrowded and saturated, Gmail came in – and took the lion’s share in a very short period. How did that happen?

They looked at the existing problems – and they fixed them. The problems before Gmail were storage space and user experience. The giants were offering a meek 5 MB plan for free – and the user experience was really horrible. Gmail fixed both. They offered 1 GB – and kept on increasing it – and delivered a smooth user experience with Ajax. An important point to note here: yes, they offered 1 GB, but who is going to use that 1 GB immediately? A very small percentage of users. So when they announced the plan, they did not have to allocate 1 GB to everybody up front. That’s it.

Find the missing but important feature – and address it. Your business will be successful – you can fight with lions and win the battle.

I was searching for a Sybase training course in Singapore. In most IT training courses, there are three levels:

1. Admin level
2. Admin – developer level
3. Developer level

In the case of Sybase, admin-level training covers how to back up, restore and so on. The third level is mostly about writing efficient queries.

But the second level is the most important. This is where fine-tuning and optimization come in – which involves collaboration between admin and developer, and this is where the real work is: explaining real-world problems and helping professionals fix them. Yet most of the currently available training courses do not address this.

This is true for other training courses too.

Scrum dashboard for a team without a common goal

When the team does not have a common goal, how do you set up a scrum dashboard?

I am setting up a new team here in my office. The team has just returned from two months of training in Paris. It is a vibrant team of young Java professionals. As the project is quite big, we have enough modules for each person to handle for the time being – which means each team member will work on his own set of modules for a while. Each one has his own delivery date and so on.

The team does not have a common goal. How do you create a project scrum dashboard for a team without a common goal?

There were proposals to group the tasks by individuals. For example, if Rupesh will work on downstream integration while Calvin will work on rollback, then the dashboard will look like this:

Name    | To Do                                | In Progress | Test | Done
Calvin  | Rollback-script, Rollback-UI Change  |             |      |
Rupesh  | Downstream-integration               |             |      |

(Yes – I like to have a Test column in all of my dashboards. Developers move a task to Test once they have developed and tested it. The task is then ready for testing by the BA.)

Now the dashboard looks great – with the tasks listed for each developer. Do you see a problem?

The problem is that this kind of dashboard leads to a situation where the developers just wait for their turn to talk in the morning scrum. They don’t pay attention to the others – but the morning scrum was created exactly for that. I have seen teams using morning scrums to report status to the team lead. Everybody talks to the team lead – who acts as the scrum master. A scrum master isn’t a lead – he is a facilitator for the team. He identifies blocking points – and helps to fix them.

During the scrum meeting, the team has to talk to itself. Everybody has to pay attention to the others – to see if they can help. That’s why we do planning meetings and retrospectives together, so that everybody gets an idea of what’s going on.

Even though each one has his own goal, they are still a team, and the modules they are working on are interconnected. As a team they are supposed to know what the others are doing and to help unblock a fellow team member when he is blocked.

When the dashboard is grouped by modules instead, I believe there will be more interaction within the team than when it is grouped by individuals. This makes it a bit harder to read the status of each individual goal, but as this is not a permanent setup for the team, the advantage gained from team interaction is worth much more for now. Team integration is important for a new team.

What do you think?

TestDouble

Today I came across the term Test Double – and I was wondering what it was! I had no idea – though obviously it had something to do with unit tests or Fitnesse tests!

A Test Double is a generic term for any case where you replace a production object for testing purposes.

There are different types of Test Doubles:

1. Dummy Object

2. Test Stub

3. Test Spy

4. Mock Object

5. Fake Object

One of the common cases where we use a Test Double is when we need to communicate with external services.
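As a small sketch of that idea in Python (the class and method names here are made up for illustration), an external payment service can be replaced with a mock that is stubbed for input and spied on for verification:

```python
from unittest import mock

# Hypothetical production code: OrderService talks to an external
# payment gateway over the network.
class OrderService:
    def __init__(self, payment_gateway):
        self.payment_gateway = payment_gateway

    def checkout(self, amount):
        # In production this call goes to the external service.
        return "OK" if self.payment_gateway.charge(amount) else "DECLINED"

# In a test, the real gateway is replaced by a test double:
gateway = mock.Mock()                # mock object standing in for the service
gateway.charge.return_value = True   # stubbed response (Test Stub behaviour)
service = OrderService(gateway)

assert service.checkout(42) == "OK"
gateway.charge.assert_called_once_with(42)   # spy-style verification
```

The same `Mock` object plays several of the roles above: it is a stub when you preset `return_value`, and a spy when you assert on recorded calls afterwards.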

More information here!

Generate insert statements for existing data

Once in a while I get this query: do you know any tool to generate scripts for the data in an existing table – for SQL Server, of course?

Well, there are quite a few options.

If your company is rich enough to sponsor a tool, you could go for Red Gate’s SQL Toolbelt. It has quite a nice set of tools to generate scripts from existing or new data, to compare databases, and so on.

You could also try the free SSMS Tools Pack: http://www.ssmstoolspack.com. But it works only with SQL Server Management Studio 2005 SP2 or above.

If you are using SQL Server 2008, you can use the option built into SSMS 2008. This is a quick run-through to generate INSERT statements for all of the data in your table, using no scripts or add-ins to SQL Server Management Studio 2008:

  1. Right-click the DATABASE NAME
  2. TASKS > GENERATE SCRIPTS
  3. Under Table/View Options, set SCRIPT DATA = TRUE

You will then get the create table statement and all of the INSERT statements for the data straight out of SSMS.
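If you just need something quick and scriptable, the core idea is easy to sketch outside SSMS as well. Here is a minimal Python sketch (the table and column names are made up; it handles only NULLs, numbers and strings, not dates or binary data):

```python
def quote(value):
    """Render a Python value as a minimal T-SQL literal."""
    if value is None:
        return "NULL"
    if isinstance(value, (int, float)):
        return str(value)
    # Escape embedded single quotes by doubling them.
    return "'" + str(value).replace("'", "''") + "'"

def insert_statements(table, columns, rows):
    """Yield one INSERT statement per row of data."""
    cols = ", ".join(columns)
    for row in rows:
        vals = ", ".join(quote(v) for v in row)
        yield f"INSERT INTO {table} ({cols}) VALUES ({vals});"

# Example: rows as they might come back from a SELECT
for stmt in insert_statements("dbo.Customer", ["Id", "Name"],
                              [(1, "O'Brien"), (2, None)]):
    print(stmt)
```

A real tool has to deal with identity columns, dates, and binary types too, which is exactly why the tools above are worth using.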

You can also use this simple stored procedure given generously by Vyas.  The procedure that does the script generation is here.

I have been using it for quite some time now. It works perfectly for my requirements!

TFS build service account change gives “workspace in use” error

We changed the existing service account used by the TFS build agent to a new service account. After this change our continuous integration failed with the following error:

Problem:
C:\Program Files\MSBuild\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets(699,5,699,5): error : The working folder [WorkingFolder] is already in use by the workspace [workspace];[domain]\[user] on computer [buildmachine]

Reason:
Part of the Team Build process involves creating a workspace that is used to get sources for the build. This workspace is typically deleted and then re-created during the course of the build, meaning that after the build the workspace hangs around. So when you change the service account, the delete step of the next build has nothing to do (since workspaces are owned per user, and the new user has no workspace to delete) and the create step fails, since a workspace already exists in the same location. You just need to delete the old workspace, owned by DEV\tfssetup.

I found the above excellent tip on this page.

And the correct command to execute the deletion of workspace is as follows:

C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE>tf workspace /delete /server:http://[servername]:8080/ [workspacename];[domainname]\[utcode]

You will need workspace deletion permission to execute the above command.

More information here!

Happy programming!

Events and User group meetings

Singapore Dot Net user group meeting

The February UG meeting will be held on 4th February 2010, at Level 22 CF12, NTUC Building, from 1845 till late. Light refreshments will be served before the session.

Software Testing in Visual Studio 2010 and Team Foundation Server 2010
Maung Maung Phyo. MVP (Security)
1900 – late

Learn how Team Lab Management in TFS 2010 enables better collaboration between software developers and testers. We will show you how testers can use Test and Lab Manager to plan tests, associate requirements, execute test cases and report bugs with invaluable information, ensuring that developers are able to fix the reported bugs.

MS Tech Events

This event gives you a valuable opportunity to learn more about how the Microsoft Application Life-Cycle Management (ALM) solution can help you reduce cost, shorten development time, increase quality, and improve project manageability.

Silverlight 4 in a Social Setting @Hackerspace – 10 Feb 2010

Our local MVPs (Justin and Michael) are organizing a social gathering on 10 Feb 2010 from 7pm to 10pm @HackerspaceSG (70A Bussorah Street). See Silverlight 4.0 in action and discuss or brainstorm Silverlight 4 projects.

This is an informal social gathering for like-minded developers.

Michael Sync (Silverlight MVP) will share "What’s New in Silverlight 4.0" in 15 minutes, followed by a roundtable session to get to know everyone. We will also share projects and ideas that developers are working on.

All are welcome to just show up! No registration required!

Actually, we were looking for a course like this: Visual Studio Team System Hands-On Lab. Unfortunately we missed it. Now I have subscribed to the feeds; I will never miss a course again.

If you have any info regarding this kind of social-technical gathering or hands-on lab in Singapore, please share it with me.

Using INT or GUID for Primary Keys? Or COMB GUID?

Today I read an interesting article by Jimmy Nilsson (the author of .NET Enterprise Design) about using auto-generated primary keys in SQL Server – especially for surrogate keys.

Of course, the default choice for most of us is to use INT (or BIGINT) as the data type and set IDENTITY to true. There is another option: using a GUID [NEWID() in SQL Server].

Comparing INT and GUID as the primary key, we can list a few important differences:

Pros of using GUID

  1. A GUID will be unique throughout the database, while an INT is not
  2. @@IDENTITY can be a problem when doing an INSERT through triggers: @@IDENTITY returns the last identity created on the current connection, which may be the one generated inside the trigger rather than the one for your row. [This can be solved using SCOPE_IDENTITY – see here for an excellent explanation!]
  3. A GUID can be generated for the primary key before the row is inserted into the database
  4. Integers are not big enough for some scenarios, even if you start from the smallest negative value; BIGINT can be a solution here
  5. Using INT can be a real nightmare when doing a manual merge of tables

Cons of using GUID

  1. In theory GUIDs present a risk of duplicates, but in practice it is negligible. In recent Windows versions (from Windows 2000) the algorithm to generate GUIDs doesn’t use the MAC address of the network card; instead the value is just random.
  2. A GUID is four times larger than an INT (16 bytes versus 4) – a significant overhead for the key and every index that includes it.
  3. There are portability problems; not all databases have GUID as a data type.

Evolution of the COMB GUID

So apparently the overhead is the main problem here. When Jimmy Nilsson did real inserts using GUIDs as primary keys, he found that inserts with GUIDs took about 30 times longer than inserts with INTs. That happened for two reasons:

1. The generated GUIDs lacked order; they were random. As you can see in the table below, only the version digit 4 is common.

C87FC84A-EE47-47EE-842C-29E969AC5131
2A734AE4-E0EF-4D77-9F84-51A8365AC5A0
70E2E8DE-500E-4630-B3CB-166131D35C21
15ED815C-921C-4011-8667-7158982951EA
56B3BF2D-BDB0-4AFE-A26B-C8F59C4B5103

2. Each INSERT forces the index to be reordered, and this takes a lot of time. The row’s position in the index depends on the last bytes of the GUID, which are random.

Therefore Jimmy Nilsson arrived at a new algorithm; his new GUIDs are called COMB GUIDs: random GUIDs whose last bytes are replaced with a timestamp, so consecutive values land near each other in the index.
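A minimal Python sketch of the idea (not Nilsson’s exact T-SQL implementation, but the same combination: a random GUID whose last 6 bytes encode the current time, mirroring SQL Server’s datetime layout of days plus 1/300ths of a second):

```python
import datetime
import uuid

def comb_guid(now=None):
    """Return a COMB GUID: a random GUID whose last 6 bytes are a
    timestamp, so values generated later sort after earlier ones."""
    now = now or datetime.datetime.utcnow()
    base = datetime.datetime(1900, 1, 1)   # SQL Server datetime epoch
    delta = now - base
    days = delta.days.to_bytes(2, "big")   # days since 1900-01-01
    # Time of day in 1/300ths of a second, as SQL Server stores it.
    ticks = delta.seconds * 300 + delta.microseconds * 300 // 1_000_000
    time_part = ticks.to_bytes(4, "big")
    raw = bytearray(uuid.uuid4().bytes)    # start from a random GUID
    raw[10:16] = days + time_part          # overwrite the last 6 bytes
    return uuid.UUID(bytes=bytes(raw))
```

Because SQL Server orders uniqueidentifier values primarily by those last bytes, new COMB GUIDs are appended near the end of the index instead of being scattered across it.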

Jeffrey Palermo, in his book ASP.NET MVC in Action, uses the COMB GUID in his sample code explaining NHibernate. I am using COMB GUIDs in my pet project.

Other Resources:

  1. GUIDs as PRIMARY KEYs and/or the clustering key
  2. NHibernate POID Generators revealed : COMB GUID does not cause database fragmentation
  3. INT vs Unique-Identifier for ID field in database
  4. GUIDs are one of the leading causes of index fragmentation
  5. Top tips for effective database managements