Damn, can someone be this ignorant?

Friday, June 28, 2013

There is nothing technical in this post, but I cannot resist sharing an experience I had at work.

Currently, I am working in the utilities industry, and there is an application that end users can use to view their electricity usage. Nothing too fancy: you can log in, view your daily, weekly, monthly or yearly graphs, or download the last 2 years of data as a csv.

I received an issue from a customer saying that when he downloads his meter data from the portal, he can only see records for the year 2011. That's a problem, because he should have been able to view records ranging from today back to 2 years before the current date.

So, I started investigating to get to the bottom of it. I checked the views in the back-end database, and all his data was there. I even checked the logs; everything looked fine. I talked to various teams, expecting someone would know what could be wrong.

In the end, I had to give up, and I asked the customer to try downloading the data once again, hoping he would now see everything fine. Bad luck: the user still complained of seeing data only for the year 2011. So I asked the user to send me the file so that I could have a look at it.

And there was the Eureka moment. I finally cracked it and told the user why he was not able to view records for the year 2013. The reason was very simple: "Please scroll down using the vertical scroll-bar and you will be able to see all the records".

The data was always there, but he hadn't cared to scroll down to the bottom.

Damn!

Moving database connection string to Azure service configuration (cscfg)

Wednesday, October 10, 2012


While working with ASP.NET web sites/projects we normally keep our database connection string in Web.config. However, when working in Azure, it's a good idea to keep this setting in the service configuration itself, as it is easier to change the connection string once you have deployed your Azure service, and it prevents the need for a redeployment.

Also, keeping the database configuration in the service configuration file means you will not need to keep changing the database server name or credentials when switching between working locally and deploying to Azure.

Because we can have different service configuration files for different environments, such as local, cloud, or even test and staging, we can simply declare a key in the service definition file (csdef) and provide a value for each environment in the corresponding service configuration file (cscfg).

This is especially helpful when you are using a membership provider for forms authentication, authorization and/or session management, such as the ASP.NET Universal Membership provider for SQL Azure, because it requires the connection string to be present in web.config; by default it is named DefaultConnection.

To achieve this, remove the connectionStrings section from web.config, add a key for your database connection in your csdef file, and provide its values in the cscfg. Then it is simply a matter of adding the section back at run time in the Application_Start event, reading the value from the cscfg.

So, first declare a key in your csdef file and set its value in the cscfg file, for example:

<Setting name="MyApplicationDB" value="Data Source=.\sqlexpress;Initial Catalog=Customers;Integrated Security=False;User ID=App.Web;Password=%#$##@;MultipleActiveResultSets=True;" />
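To be clear about where each piece lives: the csdef only declares the setting, while each cscfg (local, cloud, staging, etc.) supplies its own value for it. A minimal sketch, where the SQL Azure value is purely illustrative:

<!-- ServiceDefinition.csdef: declare the setting, no value here -->
<ConfigurationSettings>
  <Setting name="MyApplicationDB" />
</ConfigurationSettings>

<!-- ServiceConfiguration.Cloud.cscfg: an illustrative SQL Azure value -->
<ConfigurationSettings>
  <Setting name="MyApplicationDB" value="Server=tcp:yourserver.database.windows.net;Database=Customers;User ID=App.Web@yourserver;Password=yourpassword;Encrypt=True;" />
</ConfigurationSettings>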


Then, in your Global.asax.cs file's Application_Start event, include the following code to add the connection strings section to the runtime configuration:

// Requires: using System.Reflection; using System.Configuration;
// using Microsoft.WindowsAzure.ServiceRuntime;
string connectionString = RoleEnvironment.GetConfigurationSettingValue("MyApplicationDB");

// Obtain the internal RuntimeConfig type and its instance.
Type runtimeConfig = Type.GetType("System.Web.Configuration.RuntimeConfig, System.Web, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a");
var runtimeConfigInstance = runtimeConfig.GetMethod("GetAppConfig", BindingFlags.NonPublic | BindingFlags.Static).Invoke(null, null);

// Get the connection strings section and make its (normally read-only) collection writable.
var connectionStringSection = runtimeConfig.GetProperty("ConnectionStrings", BindingFlags.NonPublic | BindingFlags.Instance).GetValue(runtimeConfigInstance, null);
var connectionStrings = connectionStringSection.GetType().GetProperty("ConnectionStrings", BindingFlags.Public | BindingFlags.Instance).GetValue(connectionStringSection, null);
typeof(ConfigurationElementCollection).GetField("bReadOnly", BindingFlags.NonPublic | BindingFlags.Instance).SetValue(connectionStrings, false);

// Add the connection string read from the cscfg, under the name the membership provider expects.
((ConnectionStringsSection)connectionStringSection).ConnectionStrings.Add(new ConnectionStringSettings("DefaultConnection", connectionString));
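Note that this relies on reflection over System.Web internals (RuntimeConfig and the bReadOnly field), so it may break on future framework versions. Once Application_Start has run, though, the rest of the application, including the membership provider, reads the connection string in the usual way:

    // For example, anywhere after Application_Start has executed:
    string cs = System.Configuration.ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString;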


An ASP.NET Picasa Image Gallery

Monday, October 1, 2012

A few days back I was thinking of creating an image gallery for the collection of photos I have. Although there are multiple options available over the internet that you can download and get going with right away, most of them involve saving the images on your own server. What I wanted, though, was just a display-only gallery on my web site to showcase my photos to the world, and I wanted to make use of one of my social network accounts (Facebook, Google+, Twitter or Flickr) to host the images.

All the major social network sites provide an API to get the photos from an album. First, I tried to make use of Facebook, but its access token expires in a few hours and you then need to get a new one. That normally involves a user logging in to your application first, whereas I wanted a permanent access token, or a way to refresh the access token without the user's intervention. While I think it is possible to achieve this somehow, I tried my hand at Picasa (Google+) instead and it was much easier.

Using the Picasa Web Albums Data API, you can query any public album and get its photos. You need two things for this: the Album Id and the User Name. In case you need to show photos from one of your private albums, you additionally need to authenticate with the API using your Google account credentials.

Let's go step by step and see how to achieve this. I am using the Google Data API for .NET and the Galleriffic jQuery plugin for the image gallery.

Step 1: Design your gallery.
The first step is to create your image gallery base. I am using the Galleriffic jQuery plugin, and for this a few style sheet (css) and javascript files need to be included in the project.

First, go to http://www.twospy.com/galleriffic/ and download the plugin. Copy the css and js folders into your ASP.NET website/application project. The above link showcases 5 examples, and I am using the second one (Thumbnail rollover effects and slideshow crossfades), as it is closest to the image gallery look and feel I wanted.

The CSS file used for example 2 is galleriffic-2.css. You can exclude the other numbered css files, such as galleriffic-1.css and galleriffic-3.css, but you will need the remaining css and image files.

After you are done copying in the css and js folders, the next step is to reference them in your page. In this example I am using Default.aspx as my image gallery page, but it can be anything for you, such as Gallery.aspx. In the head tag, include the following tags:

    <link rel="stylesheet" href="css/basic.css" type="text/css" />
    <link rel="stylesheet" href="css/galleriffic-2.css" type="text/css" />
    <script src="//ajax.googleapis.com/ajax/libs/jquery/1.8.0/jquery.min.js" type="text/javascript"></script>
    <script type="text/javascript" src="js/jquery.galleriffic.js"></script>
    <script type="text/javascript" src="js/jquery.opacityrollover.js"></script>
   
    <script type="text/javascript">
        // Hide the noscript fallback markup once javascript is known to be enabled.
        document.write('<style>.noscript { display: none; }</style>');
    </script>

Next, include the following script in your page:

    <script type="text/javascript">
        jQuery(document).ready(function ($) {
            // We only want these styles applied when javascript is enabled
            $('div.navigation').css({ 'width': '300px', 'float': 'left' });
            $('div.content').css('display', 'block');

            // Initially set opacity on thumbs and add
            // additional styling for hover effect on thumbs
            var onMouseOutOpacity = 0.67;
            $('#thumbs ul.thumbs li').opacityrollover({
                mouseOutOpacity: onMouseOutOpacity,
                mouseOverOpacity: 1.0,
                fadeSpeed: 'fast',
                exemptionSelector: '.selected'
            });

            // Initialize Advanced Galleriffic Gallery
            var gallery = $('#thumbs').galleriffic({
                delay: 2500,
                numThumbs: 15,
                preloadAhead: 10,
                enableTopPager: true,
                enableBottomPager: true,
                maxPagesToShow: 7,
                imageContainerSel: '#slideshow',
                controlsContainerSel: '#controls',
                captionContainerSel: '#caption',
                loadingContainerSel: '#loading',
                renderSSControls: true,
                renderNavControls: true,
                playLinkText: 'Play Slideshow',
                pauseLinkText: 'Pause Slideshow',
                prevLinkText: '‹ Previous Photo',
                nextLinkText: 'Next Photo ›',
                nextPageLinkText: 'Next ›',
                prevPageLinkText: '‹ Prev',
                enableHistory: false,
                autoStart: false,
                syncTransitions: true,
                defaultTransitionDuration: 900,
                onSlideChange: function (prevIndex, nextIndex) {
                    // 'this' refers to the gallery, which is an extension of $('#thumbs')
                    this.find('ul.thumbs').children()
                                                .eq(prevIndex).fadeTo('fast', onMouseOutOpacity).end()
                                                .eq(nextIndex).fadeTo('fast', 1.0);
                },
                onPageTransitionOut: function (callback) {
                    this.fadeTo('fast', 0.0, callback);
                },
                onPageTransitionIn: function () {
                    this.fadeTo('fast', 1.0);
                }
            });
        });
    </script>

The next step is to add the necessary div tags as detailed in the Galleriffic documentation (http://www.twospy.com/galleriffic/). The plugin shows each image's thumbnail as a list item in an HTML unordered list (ul tag). As we will be getting our images at run time via a call to the Picasa API, we need the li items generated from code. Therefore, I have added a div (divSlider) and marked it runat="server" so that I can assign the constructed html to its InnerHtml.

Within your form tag in your aspx, include the following:


 <div id="page">
     <div id="container">
         <h1><a href="#">My Website</a></h1>
         <h2>Gallery</h2>

         <div id="gallery" class="content">
             <div id="controls" class="controls"></div>
             <div class="slideshow-container">
                 <div id="loading" class="loader"></div>
                 <div id="slideshow" class="slideshow"></div>
             </div>
             <div id="caption" class="caption-container"></div>
         </div>
         <div id="thumbs" class="navigation">
             <div id="divSlider" runat="server"></div>
         </div>
         <div style="clear: both;"></div>
     </div>
 </div>

Next, we will make a call to the Picasa API, get the images within a specific album, construct the unordered list html, and assign it to divSlider.

Step 2: Query the API

Download and install the Google Data API SDK for .NET. Once installed, add references to the following dlls in your ASP.NET project:

Google.GData.Client.dll
Google.GData.Photos.dll
Google.GData.Extensions.dll

After you have installed the SDK, the default location for these dlls is C:\Program Files (x86)\Google\Google Data API SDK\Samples on a 64 bit system and C:\Program Files\Google\Google Data API SDK\Samples on x86.

Next, include the following namespaces in your aspx.cs file:

using Google.GData.Photos;
using Google.GData.Client;
using Google.GData.Extensions;
using Google.GData.Extensions.Location;


There are two scenarios in which you can display the images:
1. Displaying a public album, which can belong to you or any other user. For this you will need the Google account username (yours, or that of the person whose album you are accessing) and the album id.
2. Accessing your own private album. Here you will need to provide your username, password and album id; you authenticate with the API using your credentials and then access the album.

In either case, it's a good idea to keep the username and album id in Web.config or in a constants class.

  <appSettings>
    <add key="albumid" value="5792668263385651889"/>
    <add key="user" value="username@gmail.com"/>
    <add key="password" value=""/>
  </appSettings>


To get the album id for the album that contains the photos you want to display, simply browse to the album and take the id from the url in the address bar. For example, for https://plus.google.com/u/0/photos?tab=mq#photos/114107981519387242086/albums/5792668263385651889, the album id is 5792668263385651889.

Now, include the following in your Page_Load:

string userName = ConfigurationManager.AppSettings["user"];
string password = ConfigurationManager.AppSettings["password"];

PicasaService service = new PicasaService("freak.roach-sample");
//service.setUserCredentials(userName, password);  // needed when you show albums with private visibility

PhotoQuery query = new PhotoQuery(PicasaQuery.CreatePicasaUri(userName, ConfigurationManager.AppSettings["albumid"]));
PicasaFeed feed = service.Query(query);

StringBuilder html = new StringBuilder();

html.Append("<ul class=\"thumbs noscript\">");

foreach (PicasaEntry entry in feed.Entries)
{
    // Strip the file extension from the photo title for display.
    string title = entry.Title.Text.Substring(0, entry.Title.Text.LastIndexOf("."));

    html.Append(String.Format("<li><a class=\"thumb\" name=\"{0}\" href=\"{1}\" title=\"{2}\"><img src=\"{3}\" alt=\"{4}\"/></a>",
        title, entry.Media.Content.Url, title, entry.Media.Thumbnails[0].Url, title));
    html.Append(String.Format("<div class=\"caption\"><div class=\"image-title\">{0}</div><div class=\"image-desc\">{1}</div></div></li>",
        title, entry.Summary.Text));
}

html.Append("</ul>");
divSlider.InnerHtml = html.ToString();

That's it. Now when you run this project you will see a nice image gallery like the one below:



Next Steps
The above approach can be extended to show images from Twitter, Flickr, Facebook, Photobucket, etc. as well.

Source Code
The source code is available at the MSDN Samples Gallery link below, and you can use it as is; it is ready to go after you change the configuration key values.

source code - http://code.msdn.microsoft.com/Picasa-Google-Image-bc8bc8d6

Display List of Uploaded Azure VM Role vhd images in a list

Thursday, May 19, 2011

I just posted a sample on MSDN for displaying the list of base vhd images uploaded to the Azure Portal for the Virtual Machine (VM) Role.

You can get the list of uploaded vhds simply by running the following command at a command prompt:


csupload.exe Get-VMImage -Connection "SubscriptionId=xxxxxx-xxxxxx-xxxxxx-xxxxxx;CertificateThumbprint=xxxxxxxxxxxxxxxxxxx"

Thus, using the Get-VMImage switch of csupload, you can get the complete details, but the output is not very presentable. It looks something like this:

[Screenshot: raw csupload Get-VMImage output]

To make this presentable, we simply call this process from our C# code, parse the output, save the properties to a list of objects, and bind that list to a GridView, so that it looks like this:

[Screenshot: csupload Get-VMImage output formatted in a GridView]
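A minimal sketch of that approach, assuming csupload.exe is on the path and that each property appears on its own "Name : Value" line in the output (the exact format and property names may differ by SDK version, so adjust the parsing accordingly):

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    public class VmImageInfo
    {
        public string Name { get; set; }
        public string Status { get; set; }
        // ...add the other properties you want to show in the grid.
    }

    // Run csupload and capture its console output.
    ProcessStartInfo psi = new ProcessStartInfo("csupload.exe",
        "Get-VMImage -Connection \"SubscriptionId=xxxxxx;CertificateThumbprint=xxxxxx\"")
    {
        UseShellExecute = false,
        RedirectStandardOutput = true,
        CreateNoWindow = true
    };

    string output;
    using (Process p = Process.Start(psi))
    {
        output = p.StandardOutput.ReadToEnd();
        p.WaitForExit();
    }

    // Parse "Name : Value" pairs into objects.
    List<VmImageInfo> images = new List<VmImageInfo>();
    VmImageInfo current = null;
    foreach (string line in output.Split(new[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries))
    {
        string[] parts = line.Split(new[] { ':' }, 2);
        if (parts.Length != 2) continue;

        string key = parts[0].Trim();
        string value = parts[1].Trim();
        if (key == "Name") { current = new VmImageInfo { Name = value }; images.Add(current); }
        else if (current != null && key == "Status") current.Status = value;
    }

    // Bind the list to a GridView on the page:
    // imagesGridView.DataSource = images;
    // imagesGridView.DataBind();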


Follow this link for details on how to achieve this: http://code.msdn.microsoft.com/Displaying-list-of-upload-4957b5c8

csupload issue on 32 bit machines : Azure SDK 1.4 Fix

Thursday, March 10, 2011

Many of you might have encountered an issue while uploading your VM Role base image via csupload from a 32 bit machine.

The issue has now been resolved with the new release of Azure SDK 1.4. Now csupload can be used from x86 platforms too.

There are many more additions in Azure SDK 1.4, mainly including Windows Azure Connect and the Content Delivery Network (CDN).

Installing Tomcat in Windows Azure

Wednesday, March 2, 2011

There are a few options already available to install Tomcat on Windows Azure, which involve running some scripts that create a package and a definition file that you can deploy to Windows Azure. However, I personally feel there is a much easier solution for installing Tomcat.

The solution I am discussing here makes use of startup tasks with elevated privileges, which were introduced in Azure SDK 1.3.
  1. Download and install the JRE on your local system. Zip the jre folder and upload it to blob storage.
  2. Download Tomcat on your local system.
  3. Edit Tomcat's server.xml to configure specific ports and enable SSL.
  4. If you need to deploy any java .war file in your Tomcat, copy this war file into Tomcat's webapps folder.
  5. Zip the Tomcat folder and upload it to blob storage.
  6. Now create a worker role and enable the TCP ports configured for Tomcat.
  7. Add a startup task to this worker role that does the following:
    • Unpack the jre zip file to a local drive on Azure.
    • Unpack our customized Tomcat zip file to a local drive on Azure.
    • Set the JRE_HOME environment variable to the path where the JRE was unpacked.
    • Set the CATALINA_HOME environment variable to the path where Tomcat was unpacked.
    • Start Tomcat.

It shouldn't be difficult for you to implement the above steps yourself; however, I seem to have plenty of time today, so let me explain these steps too.

Prepare Java
It shouldn't be difficult for you to download Java from the Oracle/Sun site. Just download and install it. It will create a folder on your local machine, e.g. jre6. You need to zip this folder and upload it to your blob storage. You can obviously host it anywhere else too, but since we are discussing Azure, let's keep it there.

Prepare and configure Tomcat
Now download your desired version of Tomcat from http://tomcat.apache.org/ and unzip it on your local machine.
  • Setup Ports in Server.xml
    You will have to select the ports you want your tomcat to run on. Say for http you want port 80 and for https you need port 443.

    Go to the conf folder inside your Tomcat folder and open the file server.xml.

    Search for the following line and replace 8080 with 80 and 8443 with 443:

     <Connector port="8080" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="8443" />

    After changing the ports this line will look like:

     <Connector port="80" protocol="HTTP/1.1" connectionTimeout="20000" redirectPort="443" />

  • Enable SSL
    Again open the server.xml file and search for the following line:

     <!--
     <Connector port="8443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" />
     -->

    This is commented out by default. Uncomment it and change the port to 443, so that the line looks as below:

     <Connector port="443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS"/>

    For SSL we use a .pfx file with Tomcat instead of a normal keystore file. I will assume here that you already have the .pfx file for a PKCS12 certificate. This certificate needs to be added to your cloud service in the Azure portal. Copy this .pfx file to a folder inside your Tomcat folder, say under webapps, and make the following changes to the above line in server.xml:

     <Connector port="443" protocol="HTTP/1.1" SSLEnabled="true" maxThreads="150" scheme="https" secure="true" clientAuth="false" sslProtocol="TLS" keystoreFile="\webapps\myservice.cloudapp.net.pfx" keystorePass="" keystoreType="PKCS12"/>

  • Deploy war files if required
    If you have some war files that need to be deployed with Tomcat, just copy them under the webapps folder of Tomcat. When Tomcat is started it will install these applications.
  • Upload Tomcat
    Once done, zip your customised Tomcat and upload it to blob storage.

Worker Role 
Now create a new Cloud Service Project in Visual Studio and add a new Worker role to it.

  • Add Certificate
    Upload the certificate that you used for Tomcat's SSL to the portal, or include it in your worker role project.
  • Enable TCP Ports
    You need to enable in your worker role the TCP ports that you configured for your Tomcat. In our case these are ports 80 and 443, as shown in the sketch below.
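    The endpoint names used in the OnStart code below ("Tcp80" and "TcpSSL") are assumed to be declared in the service definition, along these lines:

      <Endpoints>
        <InputEndpoint name="Tcp80" protocol="tcp" port="80" />
        <InputEndpoint name="TcpSSL" protocol="tcp" port="443" />
      </Endpoints>

    Then, in your worker role's OnStart: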
    public override bool OnStart()
    {
        // Set the maximum number of concurrent connections.
        ServicePointManager.DefaultConnectionLimit = 12;

        // For information on handling configuration changes
        // see the MSDN topic at http://go.microsoft.com/fwlink/?LinkId=166357.

        // Open listeners on the endpoints reserved for Tomcat (requires
        // using System.Net, System.Net.Sockets and Microsoft.WindowsAzure.ServiceRuntime).
        TcpListener port80Listener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["Tcp80"].IPEndpoint);
        TcpListener sslListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["TcpSSL"].IPEndpoint);

        port80Listener.Start();
        sslListener.Start();

        return base.OnStart();
    }

  • Startup Tasks
    Then, as we do for startup tasks, create a startup.cmd file in your worker role, mark its Build Action property as Content and its Copy to Output Directory property as Copy Always.



    Your startup.cmd file will have the following commands:
    • Download tomcat.zip and unzip
      We will use GetFiles.exe, developed by us, to download tomcat.zip. It is just a console app that takes a blob url as its first argument and the path to save the file to as its second. Then, using another custom utility, ZipUtility.exe, we unzip tomcat.zip to a folder named tomcat on the C: drive. (The blob url below is a placeholder for wherever you uploaded the zip.)

      start /w GetFiles.exe http://youraccount.blob.core.windows.net/installers/tomcat.zip C:\tomcat.zip
      start /w ZipUtility.exe C:\tomcat.zip C:\

    • Download jre.zip and unzip
      Similarly, download and unzip the jre:

      start /w GetFiles.exe http://youraccount.blob.core.windows.net/installers/jre6.zip C:\jre6.zip
      start /w ZipUtility.exe C:\jre6.zip C:\

    • Set Environment variables
      We need to set up two environment variables.
      JRE_HOME needs to be set to your jre folder, which in our case is C:\jre6

      SET JRE_HOME=C:\jre6

      CATALINA_HOME needs to be set to your tomcat folder, which in our case is C:\tomcat

      SET CATALINA_HOME=C:\tomcat

      Note: you can also set environment variables in your service definition file using the Runtime section, as below:

      <Runtime>
        <Environment>
          <Variable name="CATALINA_HOME" value="C:\tomcat" />
        </Environment>
      </Runtime>

    • Start tomcat

      C:\tomcat\bin\startup.bat

So startup.cmd will look like below:
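(Assembled from the steps above; the blob urls are placeholders.)

      start /w GetFiles.exe http://youraccount.blob.core.windows.net/installers/tomcat.zip C:\tomcat.zip
      start /w ZipUtility.exe C:\tomcat.zip C:\
      start /w GetFiles.exe http://youraccount.blob.core.windows.net/installers/jre6.zip C:\jre6.zip
      start /w ZipUtility.exe C:\jre6.zip C:\
      SET JRE_HOME=C:\jre6
      SET CATALINA_HOME=C:\tomcat
      C:\tomcat\bin\startup.bat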


GetFiles.exe and ZipUtility.exe are custom console apps. They are also added to the worker role project, with their Build Action property set to Content and Copy to Output Directory set to Copy Always.

Final Step
That's it. Just deploy your package to the cloud, and make sure your hosted service has the certificate that you used for Tomcat's SSL.

Changing Drive Letter of an Azure Drive (aka X-drive)

Monday, February 7, 2011

Sometimes you may want your Azure Drive to always be mounted on a fixed drive letter. Consider a scenario in an Azure VM Role where you need to mount an Azure Drive for data persistence and your VM demands the same letter for the Azure Drive every time, e.g. you installed SQL Server on your VM Role and specified a path on the Azure Drive for the mdf files so as to make the data persist.

But, as we know, Azure Drives are mounted on random drive letters. To always have a fixed letter, what you can do is change the drive letter to a fixed value using diskpart after your drive is mounted, from within the windows service you use to mount the drive in the VM Role, or from another part of your code if you are not working with a VM Role. Check this post to learn how to mount an Azure Drive in a VM Role.

To get a basic idea of how to change a drive letter using diskpart, visit this Microsoft support article: http://support.microsoft.com/kb/928543

To change the drive letter of the mounted Azure Drive using diskpart, we will create a temporary file in local resource storage. This temp file stores the diskpart commands, which we construct from the current and target drive letters. The following code can be used to achieve this:

   


// Requires: using System; using System.Diagnostics; using System.IO;
// 'drive' is assumed to be an object exposing the cache path and the
// letter the Azure Drive was mounted on.

// Create a temporary diskpart script file in local resource storage.
string diskpartFile = drive.CachePath + "\\diskpart.txt";

if (File.Exists(diskpartFile))
{
    File.Delete(diskpartFile);
}

// Build the diskpart commands from the current and target drive letters.
// (The target letter X: is an example; use whatever fixed letter you need.)
string driveLetter = drive.DriveLetter;
File.WriteAllText(diskpartFile,
    "select volume=" + driveLetter.Substring(0, 1) + Environment.NewLine +
    "assign letter=X");

// Run diskpart with the script file.
using (Process changeletter = new Process())
{
    changeletter.StartInfo.FileName = Environment.GetEnvironmentVariable("WINDIR") + "\\System32\\diskpart.exe";
    changeletter.StartInfo.Arguments = "/s " + diskpartFile;
    changeletter.Start();
    changeletter.WaitForExit();
}

File.Delete(diskpartFile);

Mounting Azure Drive in Azure Virtual Machine (VM) Role

Mounting an Azure Drive in the Azure VM Role can be beneficial in many scenarios. As we all know, the Azure VM Role is not persistent, so once you deploy a VM Role and it is restarted, all the data that was not part of the base image is gone.

Since data in a VM Role does not persist between restarts or hardware failures, we need a way to provide data persistence. As a solution, we will keep the data that needs to be persisted on a mounted vhd, upload this vhd to a page blob and, once the VM Role is deployed to Azure, mount the page blob as an Azure Drive. I already talked about the need for an Azure Drive in the VM Role for data persistence in my other post, "Data Persistance in Azure VM Role". Therefore, in this post we will focus on how to mount the Azure Drive in the VM Role.

As Azure Drives unmount themselves after 30 minutes if the process that mounted them is no longer alive, we will create a windows service that mounts the Azure Drive containing the SQL data. This windows service is set up to start automatically, and thus behaves as a VM Role adaptor. See this post to learn how to create a VM Role adaptor for startup tasks.

The windows service mounts the Azure Drive in its OnStart method and unmounts it in its OnStop method. The code to mount and unmount is the same as usual. To initialize the cache you can specify any local path while mounting the drive; the cache path and size can be configured in the app.config of the service. For example:



// Note the verbatim string; a plain "C:\Resources\AzureDriveCache" literal would not compile.
string cachePath = @"C:\Resources\AzureDriveCache";
int cacheSize = 500;
// Initialize the drive cache with a little headroom over the cache size.
CloudDrive.InitializeCache(cachePath, cacheSize + 20);
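For reference, here is a minimal sketch of the mount/unmount pair in the service's OnStart/OnStop; the storage connection string, the blob path drives/sqldata.vhd and the class name are assumptions for illustration:

    // Requires references to Microsoft.WindowsAzure.StorageClient and Microsoft.WindowsAzure.CloudDrive.
    using System.ServiceProcess;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    public class DriveMounterService : ServiceBase
    {
        private CloudDrive dataDrive;

        protected override void OnStart(string[] args)
        {
            // Cache path and size would come from app.config, as described above.
            CloudDrive.InitializeCache(@"C:\Resources\AzureDriveCache", 520);

            // The page blob holding the uploaded vhd (account and path are placeholders).
            CloudStorageAccount account = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=youraccount;AccountKey=yourkey");
            dataDrive = account.CreateCloudDrive("drives/sqldata.vhd");

            // Mount returns the path the drive was assigned, e.g. "K:\".
            string drivePath = dataDrive.Mount(500, DriveMountOptions.None);
        }

        protected override void OnStop()
        {
            // Unmount so the drive's lease is released cleanly.
            if (dataDrive != null)
            {
                dataDrive.Unmount();
            }
        }
    }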



Also, you must keep the following things in mind:

1. Make the windows service start by itself once installed, as described in a previous post regarding startup tasks in the VM Role.
2. Set the windows service to Automatic (Delayed) start. The delayed start ensures that the VM Role has finished starting and loading all the changes it requires.
3. In the Recovery tab of the service, set the service to restart after all failures (a scripted equivalent is sketched below).
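If you prefer to script those last two settings rather than set them in the services console, the equivalent sc.exe commands look roughly like this; the service name AzureDriveMounter is illustrative:

    sc config AzureDriveMounter start= delayed-auto
    sc failure AzureDriveMounter reset= 0 actions= restart/60000/restart/60000/restart/60000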
