Thursday 20 December 2012

Working with Team Foundation Service

I'm up and running now with Team Foundation Service (http://tfs.visualstudio.com/en-us/), and I can say that I'm quite impressed. I've set up a continuous integration environment at https://mackolicious.visualstudio.com/, where I've already added two projects.
There's a lot of basic functionality, such as backlog/user story generation, the typical agile 'swim lanes' (to-do, in-progress, done), burndown charts, code/check-in history, etc. My overview (dashboard) looks like this:
As you can see, by adding in the team's capacity and the sprint's dates, TFS is able to calculate burndown metrics. Cool! I'm still playing around with this piece of kit! I've already added one application using Eclipse and one using Visual Studio, and so far so good.

Sunday 25 November 2012

Release Management and Release Processes

In my experience as a developer I have observed various release processes that are used to get code to production.

In my opinion, the most effective and efficient release process within an Agile environment is one that is automated and controlled by either a QA (Quality Assurance) tester or a PO (Product Owner). Both of these individuals have a strong understanding of the acceptance criteria and customer requirements, hence they are best suited to deploy a piece of code to production.

I believe the entire process should be automated, including rollbacks and configuration changes, and should be as seamless as clicking a button.

The steps should be as follows:
1. Developer finishes work and deploys to a test environment
2. QA signs off work according to COA (conditions of acceptance) and deploys to an intermediate environment, e.g. hidden live, stage (whatever)
3. The QA and PO then review the work again on the intermediate environment and deploy directly to production


That entire process could take less than an hour, meaning work developed at the start of the day could be in production by the end :)
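To make the 'one click' point concrete, here is a purely conceptual sketch of a single promotion step with an automated rollback. Deploy, Verify and GetCurrentBuild are hypothetical placeholders, not any real tool's API:

using System;

// Conceptual sketch only: one automated promotion step with rollback baked in.
class ReleasePipeline
{
    static void Main()
    {
        new ReleasePipeline().Promote("hidden-live", "1.0.1");
    }

    public void Promote(string environment, string build)
    {
        string previousBuild = GetCurrentBuild(environment);

        try
        {
            Deploy(environment, build);
            Verify(environment); // e.g. automated smoke tests
        }
        catch (Exception)
        {
            // Rollback is part of the same automated step, never a manual task
            Deploy(environment, previousBuild);
            throw;
        }
    }

    string GetCurrentBuild(string environment) { return "1.0.0"; }           // stub
    void Deploy(string environment, string build) { /* push code + config */ }
    void Verify(string environment) { /* run smoke tests, throw on failure */ }
}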

Friday 28 September 2012

System.IO.FileNotFoundException: Could not load file or assembly 'Missing.Assembly.dll, Version=1.0.20.15800, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified

Have you ever seen an error like the one below?

System.IO.FileNotFoundException: Could not load file or assembly 'Missing.Assembly.dll, Version=1.0.20.15800, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The system cannot find the file specified

The first thing you think is "Ahhhhhhh! I have all the references, what is going on??!" Don't panic, here are a few steps to diagnosing this issue, and hopefully resolving it.

STEP 1. Make sure you can debug the code and ascertain which assembly is throwing the exception.

STEP 2. Once you know which assembly is complaining about the missing DLL, find the corresponding project and check that project's references:

If the reference named in the exception exists, move on to STEP 3.

STEP 3. If the reference exists then you need to look a little deeper, at the second part of the exception: "or one of its dependencies". This is key: it means that Missing.Assembly.dll is referencing an assembly that your project is not. What you need to do is ascertain exactly which assembly it is referencing, and which version. This can be quite tricky for large projects, but the easiest way to find out is to look at where the exception occurred (i.e. the line of code) and what that particular piece of code needs in terms of references. Do this by looking at the using statements:
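If eyeballing the using statements doesn't narrow it down, you can ask the assembly directly. Here is a small sketch that prints every assembly a given DLL was compiled against, along with the exact versions; run it against Missing.Assembly.dll and compare the output with what is actually in your bin folder:

using System;
using System.Reflection;

class ReferenceDump
{
    static void Main(string[] args)
    {
        // Load the suspect assembly from the path passed on the command line
        Assembly assembly = Assembly.LoadFrom(args[0]);

        // Print every assembly it was compiled against, with exact versions
        foreach (AssemblyName reference in assembly.GetReferencedAssemblies())
        {
            Console.WriteLine("{0}, Version={1}", reference.Name, reference.Version);
        }
    }
}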
Good luck :) Any questions?

Monday 27 August 2012

Upgrading to Windows 8

Bonjour amigos! I've just upgraded to Windows 8 Pro on the following machine (just look at the spec):
It was a very seamless and easy installation; however, my version of Kaspersky had to be uninstalled as it was not compatible with Windows 8. Apart from that, everything else worked fine. At first it was difficult to use, as navigating around the desktop (app) without a Start icon is confusing...
Whenever you press the Start key you are redirected to the new Windows Start screen! It can be quite confusing at first, but intuitive minds can get used to it quickly. For example, if you want to search for an application or file, you carry out the same action you would on Windows 7, START + "app name"; however, it looks completely different:
This concludes my analysis of upgrading to Windows 8. To summarise: I would recommend upgrading!

Monday 2 July 2012

Using ADO.NET to connect to a custom DB provider

ADO.NET is a great way to connect to a database through a data provider that exists outside the .NET Framework. There are many data providers out there, MySQL, PostgreSQL, FlySpeed, etc., which are in commercial use but are not necessarily that popular. It can sometimes be difficult to create a DAL for a custom database architecture. Fortunately we have ADO.NET, along with the DbProviderFactory class, which allows any custom DB provider to plug into the .NET CLR and lets developers write custom queries against the database. When using these custom DB providers you need to update your application configuration file so that the .NET runtime has knowledge of the DbProviderFactory that you intend to use. If you check your machine.config (C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\Config) for .NET v4.0 you should see the following entries:

<section name="system.data" type="System.Data.Common.DbProviderFactoriesConfigurationHandler, System.Data, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />



<DbProviderFactories>       
     <add name="Microsoft SQL Server Compact Data Provider" invariant="System.Data.SqlServerCe.3.5" description=".NET Framework Data Provider for Microsoft SQL Server Compact" type="System.Data.SqlServerCe.SqlCeProviderFactory, System.Data.SqlServerCe, Version=3.5.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" />
         
    <add name="Microsoft SQL Server Compact Data Provider 4.0" invariant="System.Data.SqlServerCe.4.0" description=".NET Framework Data Provider for Microsoft SQL Server Compact" type="System.Data.SqlServerCe.SqlCeProviderFactory, System.Data.SqlServerCe, Version=4.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91" />   
    
    <add name="MySQL Data Provider" invariant="MySql.Data.MySqlClient" description=".Net Framework Data Provider for MySQL" type="MySql.Data.MySqlClient.MySqlClientFactory, MySql.Data, Version=6.3.6.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d" />     
</DbProviderFactories> 

These are the default factories that come with the .NET Framework; however, if you want to introduce your own custom factory you can just add an entry. Depending on whether or not you want your factory to be available across all applications, you could instead add the entry to your local application/web configuration file.
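Here is a minimal sketch of consuming one of the registered factories through the provider-agnostic ADO.NET base classes, using the MySQL invariant name from the entries above; the connection string and query are placeholders:

using System;
using System.Data.Common;

class ProviderFactoryDemo
{
    static void Main()
    {
        // Look up the factory by the invariant name registered in <DbProviderFactories>
        DbProviderFactory factory = DbProviderFactories.GetFactory("MySql.Data.MySqlClient");

        using (DbConnection connection = factory.CreateConnection())
        {
            connection.ConnectionString = "server=localhost;database=test;uid=me;pwd=secret"; // placeholder
            connection.Open();

            using (DbCommand command = connection.CreateCommand())
            {
                command.CommandText = "SELECT COUNT(*) FROM customers"; // hypothetical table
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}

Because the code only ever talks to the DbProviderFactory, DbConnection and DbCommand base classes, swapping providers is just a matter of changing the invariant name and connection string.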

Wednesday 9 May 2012

JSONP aka JSON with PADDING

JSONP is a hack pattern which allows JavaScript from one domain to execute alongside JavaScript from another domain. This is technically not allowed, as it violates cross-domain policy; however, through the JSONP work-around it can be achieved. "Why?" I hear you ask. Well, the answer is simple: someone exposes a service on one machine and you want to consume that service on another. "But that's already possible!" you cry. Well, yes, but you can't execute that service inline. Look at the following example: on my.domain.com you have a local script called myscript.js, so the full URL to that script is my.domain.com/myscript.js. Inside this script you want to make a call to other.domain2.com/their.js and execute their JavaScript. It doesn't work, unless you explicitly use <script> tags, but that's pointless because you want to execute their code inline with yours; that's where JSONP comes in.

Using JSONP you are able to overcome this hurdle by setting up an 'understanding' between the service and the requesting JavaScript. This can be achieved by placing a query string parameter in the URL that the service understands, e.g. other.domain2.com/their.js?jsonp=yes. The service will 'wrap' its response in a JavaScript function, which the requesting JavaScript will execute once it has received the response. Once it executes the response, it will hopefully get some meaningful JSON that it can interpret and use for its own purposes.
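Since this is a .NET blog, here is a rough sketch of what the service's side of that 'understanding' could look like as an ASP.NET handler; the handler name, the jsonp query string key and the payload are all hypothetical:

using System.Web;

// Hypothetical handler: if the caller supplies a callback name via ?jsonp=...,
// the JSON payload is wrapped in a function call so the requesting page can
// execute it via a <script> tag.
public class JsonpHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        const string json = "{\"message\":\"hello\"}"; // placeholder payload
        string callback = context.Request.QueryString["jsonp"];

        if (string.IsNullOrEmpty(callback))
        {
            // Plain JSON for same-domain callers
            context.Response.ContentType = "application/json";
            context.Response.Write(json);
        }
        else
        {
            // JSONP: emit executable JavaScript, e.g. myCallback({"message":"hello"});
            context.Response.ContentType = "application/javascript";
            context.Response.Write(callback + "(" + json + ");");
        }
    }
}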

Tuesday 10 April 2012

ASP.NET Web API (Beta)

I have recently been looking into the new ASP.NET Web API to find out what features it offers, and it's quite interesting. It seems to follow a convention-over-configuration model similar to that of ASP.NET MVC, so for those familiar with those constructs it should be an easy API to pick up. You use the Global.asax.cs file to name your routes (just like ASP.NET MVC); however, you use the MapHttpRoute overload instead, e.g.

routes.MapHttpRoute(
    name: "DefaultApi",
    routeTemplate: "{controller}/notifications"
);

This is a typical route, where the {controller} template represents the controller name that is found beneath the Controllers folder.
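For completeness, here is a minimal sketch of a controller that the {controller} segment could resolve to; the controller name and the data it returns are hypothetical:

using System.Collections.Generic;
using System.Web.Http;

// Hypothetical controller; by Web API convention, a public method named Get
// handles HTTP GET requests for routes that resolve to this controller.
public class NotificationsController : ApiController
{
    public IEnumerable<string> Get()
    {
        return new[] { "build complete", "deploy pending" }; // placeholder data
    }
}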

Wednesday 11 January 2012

Using Modernizr

Modernizr is a great new JavaScript library for detecting browser features without having to write too much unnecessary code to abstract away cross-browser incompatibilities. For example, rounded corners is a feature desired by many UX (user experience) developers, and before CSS3 it could only be accomplished with JavaScript hackery. Now with Modernizr it is possible to detect whether the feature is enabled and write corresponding classes appropriately. When Modernizr is included on a page it dynamically updates the HTML class attribute with a set of classes that identify what that browser understands, e.g. multiplebgs meaning it supports multiple backgrounds, or no-multiplebgs meaning it doesn't. This allows you, the developer, to code for both scenarios and thus future-proof your web application. The same also applies to new HTML5 features such as video and localStorage: a quick Modernizr.localstorage test will reveal whether or not the browser supports it.

Tuesday 10 January 2012

Reading Files Across Servers

There are many ways to access data across servers that are shared on the same network; your setup will determine the best approach. It is possible to use the COM model to access the UNC path as an authenticated user of that machine, i.e. machine name = CPU123, user = CPU123\user.whoever. The problem with this approach is that if your servers are on different domains then it will be a huge headache to maintain all of those users; a configuration file sounds like the best way to manage them, but that will still yield a huge file, with one app setting per user. The approach I adopted was to use MSMQ: I created a WCF service for sending messages to a queue and read the state of the particular file using a FileSystemWatcher. Worked well!
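As a rough sketch of that idea (the share path, queue path and message format are placeholders; it assumes a reference to System.Messaging and that the private queue already exists):

using System;
using System.IO;
using System.Messaging;

class FileNotifier
{
    static void Main()
    {
        // Watch a shared folder on the remote server (UNC path is a placeholder)
        var watcher = new FileSystemWatcher(@"\\CPU123\shared");

        watcher.Created += (sender, e) =>
        {
            // Push a notification onto a private queue; the WCF service on the
            // other side reads from the same queue (queue path is a placeholder)
            using (var queue = new MessageQueue(@".\Private$\fileEvents"))
            {
                queue.Send(e.FullPath, "FileCreated");
            }
        };

        watcher.EnableRaisingEvents = true;
        Console.WriteLine("Watching for new files. Press Enter to stop.");
        Console.ReadLine();
    }
}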