AllPaul

programming, tech, hobbies and grief

7. April 2014 06:42
by Paul Apostolos
0 Comments

Create a simple MVC application to change SQL Azure firewall rules

I recently needed a way to allow one of our freelance developers to periodically change a SQL Azure firewall rule to cope with his dynamic IP address. The developer requested administrator access to our Azure portal, but that seemed a bit much: it would have given him full rights to everything in the portal, which was not an option from a security standpoint.

I decided there had to be a better way to allow him to change a particular firewall rule without sacrificing security.

After a bit of Web searching, I came across a great article series by Brady Gaster. In the series, he walks through the new Windows Azure Management Libraries for .NET (WAML). Of particular note to me was Managing Windows Azure SQL Databases Using the Management Libraries for .NET. Using a few nuggets from this article, I was able to create a simple MVC application to allow the developer to make this change without granting privileges to the Azure portal.

The application

The image below shows a screenshot of the application. It displays the current value of the firewall rule and provides a textbox (prepopulated with the user's current IP address) for saving a new value. Note: the values in the screenshot are just placeholders.

For this application, I kept it basic: File -> New Project -> ASP.NET Web Application.

After I tweaked a few style settings I was ready to move on to the WAML goodness.

To use the WAML, I first needed to create a management certificate, upload it to the Azure portal and then export a copy for use in my application. For detailed instructions, see the following articles.

Create and Upload a Management Certificate for Windows Azure

Creating a Personal Information Exchange certificate

Once the management certificate was created, uploaded and exported, I was ready to get started with the code.

The code

All of the code for this application is in the HomeController.cs file. The most interesting part is the createSqlManagementClient() method. It uses the certificate .pfx file I created above to build an X509Certificate2 object, then combines that certificate with the Azure subscription ID to create a new SqlManagementClient. With this client I can manage all sorts of SQL Azure resources including, for my purposes, the firewall settings.

One other item worth noting is the SqlManagementClient.FirewallRules.Update method. It accepts three arguments: the SQL server instance name, the firewall rule name and a FirewallRuleUpdateParameters object. This object, which I named parameters, has a few properties that need to be set:

  1. StartIPAddress - The start of the range of IP addresses to which the firewall rule applies
  2. EndIPAddress - The end of the range of IP addresses to which the firewall rule applies 
  3. Name - The firewall rule name (I know we already supplied that in the update method parameters, but this is to change the name if desired)

*Note: Specifying a StartIPAddress and EndIPAddress of 0.0.0.0 will effectively allow all IP addresses, so you may want to add some validation.
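As a sketch of what such validation might look like (the helper class and its name are my own illustration, not part of the original project), you could reject the wide-open range before ever calling Update:

```csharp
using System.Diagnostics;

// Hypothetical helper, not from the original solution: per the note above,
// a 0.0.0.0 start and end address effectively allows all IP addresses.
public static class FirewallRuleValidator
{
    public static bool IsUnrestricted(string startIPAddress, string endIPAddress)
    {
        return startIPAddress == "0.0.0.0" && endIPAddress == "0.0.0.0";
    }
}
```

In the POST action, you would check this before building the FirewallRuleUpdateParameters and return a validation error instead of saving.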

Below is the code for the HomeController.cs file.

public ActionResult Index()
{
	// GetSqlFirewallRuleList() creates its own client, so none is needed here
	return View(GetSqlFirewallRuleList().Where(f => f.Name == ConfigurationManager.AppSettings["FirewallRuleName"]).SingleOrDefault());
}

[HttpPost]
public ActionResult Index(string IPAddress)
{
	var sqlManagementClient = createSqlManagementClient();
	FirewallRuleUpdateParameters parameters = new FirewallRuleUpdateParameters();
	parameters.StartIPAddress = IPAddress;
	parameters.EndIPAddress = IPAddress;
	parameters.Name = ConfigurationManager.AppSettings["FirewallRuleName"];
	sqlManagementClient.FirewallRules.Update(ConfigurationManager.AppSettings["SQLServerName"], ConfigurationManager.AppSettings["FirewallRuleName"], parameters);
	return View(GetSqlFirewallRuleList().Where(f => f.Name == ConfigurationManager.AppSettings["FirewallRuleName"]).SingleOrDefault());
}

private SqlManagementClient createSqlManagementClient()
{
	//To use this, create and upload your own management certificate for Azure.  Then export the pfx file from Certificates MMC with the private key.
	//Save that file in the root directory called azure-mgt-cert.pfx and add the password to the web.config file
	//The file in this solution is empty and just a placeholder
	var cert = new X509Certificate2(Server.MapPath("/azure-mgt-cert.pfx"), ConfigurationManager.AppSettings["CertificatePassword"], X509KeyStorageFlags.MachineKeySet);
	SqlManagementClient sqlManagementClient = new SqlManagementClient(new CertificateCloudCredentials(ConfigurationManager.AppSettings["SubscriptionID"], cert));
	return sqlManagementClient;
}

private IList<FirewallRule> GetSqlFirewallRuleList()
{
	var sqlManagementClient = createSqlManagementClient();
	var firewallRuleList = sqlManagementClient.FirewallRules.List(ConfigurationManager.AppSettings["SQLServerName"]);
	return firewallRuleList.FirewallRules;
}
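The controller reads its settings from web.config; a sketch of the appSettings section it expects (the key names come from the code above, the values here are placeholders you would replace with your own):

```xml
<appSettings>
  <!-- Placeholders only: substitute your own subscription and server details -->
  <add key="SubscriptionID" value="00000000-0000-0000-0000-000000000000" />
  <add key="SQLServerName" value="yourservername" />
  <add key="FirewallRuleName" value="FreelancerRule" />
  <add key="CertificatePassword" value="your-pfx-password" />
</appSettings>
```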

All that is left to do is add the code for the view (/Views/Home/Index.cshtml) and publish.
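A minimal version of that view might look something like the following (a sketch rather than the exact markup from the repository; the model type and namespace are my assumptions based on the WAML firewall rule model):

```cshtml
@* Sketch only: model type assumed from the WAML FirewallRule model *@
@model Microsoft.WindowsAzure.Management.Sql.Models.FirewallRule

<h2>SQL Azure Firewall Rule</h2>
<p>Current range: @Model.StartIPAddress - @Model.EndIPAddress</p>

@using (Html.BeginForm())
{
    @* Prepopulate the textbox with the visitor's current IP address *@
    @Html.TextBox("IPAddress", Request.UserHostAddress)
    <input type="submit" value="Save" />
}
```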

Wrapping up

There it is: an easy way to allow non-administrators to configure a SQL Azure firewall rule without granting them access to the portal.

If you want to look at the entire solution, I created a GitHub repository for the project, so feel free to grab it and play with it. I removed the confidential information about my subscription and certificate, so you will need to add your own to get it working.

Good luck!

31. May 2013 16:30
by Paul Apostolos
0 Comments

I'm rebooting my career

When I started working at the National Roofing Contractors Association (NRCA) nearly 18 years ago, I was relieved any company would hire a college dropout with an admittedly sketchy backstory. So their offer of the position of Customer Relations Assistant (answering phone calls and shipping customer orders) was just fine with me; little did I know what a great opportunity it would be.

The NRCA years

I am grateful to NRCA for the many opportunities I was given during my employment, including promotions, increased responsibility, education assistance and flexibility. In my 18 years, I moved through the ranks and held a long list of titles:

  • Customer relations assistant
  • Risk management coordinator
  • Customer relations assistant manager
  • Customer relations manager
  • Web site manager
  • Director of Internet Development
  • Associate Executive Director of Information Technology / Chief Information Officer

Not only did I move through the ranks, I moved through the office too, changing desks or offices 10 times and finally ending up in (seemingly) the coldest but cleanest office in the building. NRCA not only provided me with opportunities to grow within the company, it also provided the thing I am most grateful for: my education.

I decided eleven years ago that I needed to do something about that college dropout moniker I was carrying around. I went back to school full time, and NRCA allowed me a flexible work schedule and also helped with my tuition bills. I graduated from DePaul with a B.S. in computer science in 2005 and then from Northwestern with an M.S. in computer information systems in 2007. Now when I yell at my kids about going to college, they won't be able to respond, "You didn't even finish"...Instead, I can say, "I went to school full time, worked full time and had a family. It was hard, so you should go while you don't have those demands"...Boom!

Being a freelancer

Sometime around 2002 I started doing freelance work here and there for various clients, mostly small projects in the beginning, but the requests began to grow in both size and volume. For the last few years, it seems like I have done nothing but work. Every day, every night, every weekend and every "day off" has been spent working, and I finally reached the tipping point. I decided about a year ago that I needed a better way to manage my work/life balance.

The real problem was that there were only so many hours I could work. I needed help, but I was too afraid to hire someone for fear of having to let them go if the work dried up. I was also reluctant to leave NRCA out of loyalty and, not to mention, the benefits; with three kids, health insurance is fairly important. Thankfully, my wife decided to go back to work full time, and in March she took a job that came with benefits, namely health insurance. That whole change has been interesting, but overall better for everyone, I think.

Starting my own company

I created a corporation two years ago in anticipation that this day would eventually be a reality. The company, Stack Solutions (www.stack-solutions.com), provides IT consulting services including custom software, website and mobile development. In addition, Stack manages IT infrastructure for small to medium businesses.

On June 1, 2013, Stack Solutions will go from my side project to my full-time endeavor. Initially there will be three employees, with plans to add a project manager within the first month.

I'm excited, nervous and optimistic. Thanks to all of Stack's clients, the future of the company looks promising, but I couldn't have done any of it without (Stack's newest client) NRCA.

20. April 2013 07:05
by Paul Apostolos
0 Comments

Use a Custom DisplayModeProvider to Serve Crawler Friendly Content Pages

I have been working on a large website redesign. The site relies heavily on client-side rendering of HTML elements and leverages Ajax to limit server postbacks. For example, to display a list or grid of items, I am using the Kendo UI Web ListView or Grid controls. These controls don't render search-engine-friendly content, however, and rendering everything on the server would mean giving up their benefits: client-side rendering, sorting and paging. Thankfully, with just a few lines of code, I can serve search engines custom views that render the content in a more indexable way, while regular visitors keep the client-side experience.

DisplayModeProviders to the rescue

In ASP.NET MVC 4 there is a new DisplayModeProvider class that lets application developers tap into the MVC view-selection pipeline. It can be configured to automatically serve different views to mobile browsers just by changing the filename of a Razor view from (for example) Index.cshtml to Index.Mobile.cshtml. And because the view is (or at least should be) responsible for rendering the output to the client, changing the view shouldn't require any changes to the underlying business logic. More simply: to support an alternative layout of the same page, a developer can add a new view for that layout without touching the application logic.

Out of the box, MVC 4 supports "Mobile" as a display mode, but additional display modes can easily be added in the global.asax file. So I decided to add a simple display mode that detects whether the current request is from a search engine indexer and serves a custom view with all the Ajax and client-side rendering removed.

To add my new DisplayMode, I added the following code to the Application_Start method of the global.asax file:

DisplayModeProvider.Instance.Modes.Insert(0, new DefaultDisplayMode("Crawler")
{
    ContextCondition = (context => Utils.IsCrawler(context.Request))
});

And in my Utils class I have a static method to detect whether the request is a search engine indexer.

public static bool IsCrawler(HttpRequestBase request)
{
    bool isCrawler = request.Browser.Crawler;

    if (!isCrawler)
    {
        Regex regEx = new Regex(ConfigurationManager.AppSettings["CrawlerMatches"]);
        // Guard against requests that don't send a User-Agent header at all
        isCrawler = request.UserAgent != null && regEx.Match(request.UserAgent).Success;
    }
    return isCrawler;
}

This method is all over the Internet; the only interesting part is that the regex pattern is stored in web.config so it can be easily modified.
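That appSettings entry might look something like this (the key name comes from the code above; the pattern itself is just an illustrative example you would extend as new crawlers show up in your logs):

```xml
<appSettings>
  <!-- Illustrative pattern only; tune it to the crawlers you care about -->
  <add key="CrawlerMatches" value="Googlebot|bingbot|Slurp|DuckDuckBot|Baiduspider" />
</appSettings>
```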

So, what happens?

When a request comes into the site, it is evaluated by the DisplayModeProvider, and if it satisfies the condition of being a search engine indexer (as determined by Utils.IsCrawler), view selection will first look for a view with the same name as the normally returned view but with a ".Crawler.cshtml" extension instead of just ".cshtml".

So for the pages that would normally include client-side rendering, I just add an additional version of the view with the ".Crawler.cshtml" extension, and that page gets served automatically to search engine indexers.

Wrapping up 

Developing a site that relies heavily on client-side content rendering can be detrimental to search engine visibility. In the past, I would have just done all the rendering on the server and sacrificed the interaction speed of having script render the contents client side. With DisplayModeProviders, I'm able to supplement the client-side rendering view with one that presents a more indexable interface.

It should be noted that I am not trying to trick the search engine spider at all. The page the spider sees is exactly what a person would see; I am merely changing the navigation to work server side instead of client side.

Give it a try and let me know what you find.

Good luck to you!