Tuesday, August 20, 2013

Are they missing the value of the process?

The 'Process'

Working in custom application development and consulting, I have often found that clients tend to miss the value of the process.

It seems that the success of the project hinges on that initial momentum, and it's often hard to get moving.

Usually when a client comes to you with an idea, they feel that they "have it down." However, when you ask questions that make them say "well, I hadn't thought of that," you get that momentum started. This is, of course, followed by "How much are we talking here?" The client wants to know the cost, right then and there.

I grew up with a father who is a contractor, building houses, adding additions, and so on, so I grew up with the idea of 'you can't build it without a plan.' However, when it comes to a digital structure like an application, clients often struggle to see the value of putting together a 'blueprint.' Often the feeling is 'this is a simple idea/project,' but more often than not there are many layers below it.

I've come to think of it like a cherry... I know, bear with me. Clients often come with their ideas; they know, or they think they know, exactly what the project needs to be. This is the cherry's pit. Almost always, though, when it comes to the discovery/planning meeting you start to uncover the flesh of the project and find out that there is a lot more to it; in fact, the idea could really be part of a whole group of projects that need to be done.

A friend and colleague of mine has a great post on this same subject, and he describes it perfectly: the client comes to you with a book review and says, 'Here's the review... now write the book!'

Friday, August 16, 2013


A quick post about some updates I was working on today. I won't be posting any snippets this time; instead, consider it a challenge to think about how you could refactor your own code to get away from repetition.

I have a few SSRS reports that use many of the same temporary tables and variables, but the definitions are simply copied and pasted throughout the SQL code.

This. Drives. Me. Nuts.

If a variable exists in multiple places, I feel almost guaranteed to forget to update it somewhere. So today I set aside some study time to refactor this code away from that bad practice, and I feel it came out very well.

To make this work I used a combination of stored procedures and function calls.

I have four reports that I really wanted to update, and they fall into two pairs. They provide data on different areas, but one pair provides a week view and the other a month view. All four reports use temporary tables that share a similar base structure with different offshoots of columns.

To improve this, I wrote a function that builds and returns the base temporary table, and then report-specific functions that call the first function, add the extra columns each report needs, and return the table to the calling stored procedure, which then fills it with data.

The other update ended up saving me quite a few lines of code. This function returns a table populated with billable-time data, but I was able to add a bit parameter specifying whether to return only billed time or only non-billed time, with a SQL IF statement branching on that bit to return the needed data. This let me use the same function call to get whichever data I needed in multiple places.
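The post leaves the refactor itself as an exercise, but for readers who want a concrete starting point, here is a minimal T-SQL sketch of the bit-parameter idea. Every table, column, and function name below is invented for illustration; it is not the actual report code:

```sql
-- Illustrative only: a multi-statement table-valued function that
-- returns either billed or non-billed time based on a bit parameter,
-- so one function call serves both cases.
CREATE FUNCTION dbo.fn_GetTimeEntries (@BilledOnly BIT)
RETURNS @Entries TABLE
(
    EntryId INT,
    Hours   DECIMAL(9, 2),
    Billed  BIT
)
AS
BEGIN
    IF (@BilledOnly = 1)
        INSERT INTO @Entries (EntryId, Hours, Billed)
        SELECT EntryId, Hours, Billed
        FROM dbo.TimeEntries
        WHERE Billed = 1;
    ELSE
        INSERT INTO @Entries (EntryId, Hours, Billed)
        SELECT EntryId, Hours, Billed
        FROM dbo.TimeEntries
        WHERE Billed = 0;
    RETURN;
END
```

A caller would then do something like `SELECT * FROM dbo.fn_GetTimeEntries(1);` for billed time, or pass 0 for non-billed, instead of maintaining two pasted copies of the query.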

If you have any tips and tricks for SQL code reuse, I'd love to hear them. SQL code reuse can be a bit tricky, so any tips would be well appreciated, not only by me but, I'm sure, by the community as well.

Monday, August 12, 2013

Windows Service Debug Installation

This morning I was faced with the issue of debugging a Windows Service installation. It was driving me crazy, as I did not know an easy way to run the installation through the Visual Studio debugger.

After reading through the installation logs I could see that an exception was being thrown in the OnBeforeInstall() override, but the rest of the information was just vague enough that I could not quickly find the answer.

After asking my friend Google if he had ever heard of this problem, I was directed to a Stack Overflow post with a similar issue. Poster Klaus had a suggestion that was all I needed. He suggested placing the below code in the area you want to debug:

#if DEBUG
System.Diagnostics.Debugger.Launch();
#endif

At this point you open a command prompt, as an administrator, and install the service. The install will break, you'll be prompted to attach a debugger and, voilà, you're in.

My problem turned out to be a null reference in the context, which was easily fixed thanks to the debugger and Klaus. Thanks, Klaus!

Stack Overflow post ref.

Wednesday, August 7, 2013

Entity Framework Constraint Woes

This morning I was updating a few models in an ASP.NET MVC4 app and needed to push those changes to the database, when I was presented with an error that threw me for a little loop. The code in my Entity Framework migration scaffold looked like:

DropPrimaryKey("dbo.{{TableName}}", new[] { "Id" });

This code resulted in the following error:

'PK_dbo.{{ModelName}}' is not a constraint.
Could not drop constraint. See previous errors.

Since I expected that Entity Framework had set up the migration with the constraint named 'PK_dbo.{{ModelName}}', it took me a few moments to check the constraint name in SQL Server and discover that the constraint still carried an older name from previous model updates that EF had never renamed.

Luckily this is an easy fix. The DropPrimaryKey() method has an overload that lets you specify the name of the constraint and, voilà, problem solved!

DropPrimaryKey("dbo.{{TableName}}", "PK_dbo.{{KEYNAME}}", new[] { "Id" });

Happy coding!

Monday, May 13, 2013

Azure Network Load Balancer Time-out Errors

Another Azure enigma...

I've been working on a web application where we process mail merges and return .docx files via OpenXML. The system works great, both locally and on Azure, as long as the mail merge batch is relatively small.

Here's where the problem becomes obvious: some batches have the potential to parse through thousands of records and return documents with thousands of pages, which causes the document request to take several minutes.

Azure, however, employs a network load balancer (NLB) that closes idle connections so they don't waste resources. Makes sense, right? Otherwise, how would they keep the pipes clear for the traffic to your site?

There is a fairly simple way around this, though some may see it as a bit of a hack. We created a custom status object to return to an $.ajax() call. This object reports the status of the mail merge and whether any errors have been encountered.

Once the call for the document reported complete, we added an <iFrame> element to the page and, via jQuery, updated its .attr("src", "{server.url.for.doc}"). This lets us simulate the original request and download the file.
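A minimal sketch of the client-side pattern described above; the endpoint, element ids, and interval are assumptions for illustration, not the actual project code:

```javascript
// Sketch only: poll a lightweight status endpoint so no single request
// sits open long enough for the Azure load balancer to drop it, then
// trigger the download via an <iframe> src (a GET) once the document
// is ready. fetchStatus is injected so the transport ($.ajax, fetch,
// a test double) stays pluggable.
function createPoller(fetchStatus, onComplete, onError, intervalMs) {
  return function poll() {
    fetchStatus(function (status) {
      if (status.error) {
        onError(status.error);          // server reported a merge failure
      } else if (status.complete) {
        onComplete(status.documentUrl); // document is ready to download
      } else {
        setTimeout(poll, intervalMs);   // not done yet; ask again shortly
      }
    });
  };
}

// Hypothetical wiring with jQuery (assumed endpoint and element id):
// createPoller(
//   function (cb) { $.getJSON("/MailMerge/Status?jobId=" + jobId, cb); },
//   function (url) { $("#downloadFrame").attr("src", url); },
//   function (err) { console.error(err); },
//   2000
// )();
```

Injecting `fetchStatus` keeps the polling loop itself free of any jQuery dependency, which also makes it trivial to unit test.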

Known Issues:

  • Requires GET, removing the POST for the document and making the call less secure (IMO)

Saturday, May 11, 2013

"No Such Host is Known..." - Windows Azure

Today, while working on a hobby project hosted on Windows Azure, I ran into one of those nagging problems that does not easily explain itself. After a few updates to a project that had grown dusty and a web publish, I was presented with the dreaded YSoD proclaiming "No Such Host Is Known."

This was a new error for me, but the stack trace led me to believe that it had to do with the database.

[Win32Exception (0x80004005): No such host is known]

[SqlException (0x80131904): A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.)]
System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) +5296071
System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) +558
System.Data.SqlClient.TdsParser.Connect(ServerInfo serverInfo, SqlInternalConnectionTds connHandler, Boolean ignoreSniOpenTimeout, Int64 timerExpire, Boolean encrypt, Boolean trustServerCert, Boolean integratedSecurity, Boolean withFailover) +5308555

Maybe it was Azure? Nope, everything was indeed running as it should be there. Maybe my connection string in the project? Still no; the Web.config looked good. I'll try an EF "Update-Database" via the Package Manager Console. Sure enough, that was working too.

So what was the cause of this problem?

Well, it turns out my hunch about the Web.config connection string was pretty darn close. Azure provides a way to obfuscate your connection string details by overriding the connection string with details provided in the Azure site configuration settings.

Via the Azure Portal, go to your site's "Configure" tab to view the "Connection String" overrides.

It was their connection string override that was causing the problem.

Well, okay, it was mostly their fault.

My issue was that the connection string name in Azure did not match the one in my project. Making the names match and a quick save, and I was back in business.
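For illustration, the override is matched on the name attribute of the connection string in Web.config, so the two must agree exactly. The name and values below are hypothetical placeholders:

```xml
<!-- Web.config: the name here must exactly match the name of the
     connection string override defined in the Azure portal's
     Configure tab, or the override never takes effect. -->
<connectionStrings>
  <add name="DefaultConnection"
       connectionString="Data Source=...;Initial Catalog=...;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```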

My online searches did not have much to say about fixing this issue for a site hosted in Azure, but hopefully this can help you with troubleshooting yours. Good luck and happy coding.

Obligatory First Post

I use this blog to help ensure that I don't forget some of the great content that I come across while troubleshooting or browsing the web. Hopefully you'll find these entries valuable. Feel free to contribute via the comments.