20110221

Disconnecting users from MSSQL database

When trying to restore a database from a backup, an error about connected users is often displayed.
The script below kills all connections to the given database.
Just replace XYZ with the required database name.
Remember that all connections will be dropped without notice.

USE master

GO

DECLARE @userId varchar(10) 

DECLARE cur CURSOR READ_ONLY      
FOR 
SELECT request_session_id
FROM master.sys.dm_tran_locks     
WHERE resource_type = 'DATABASE'     
AND resource_database_id = db_id('XYZ')    
GROUP BY request_session_id

OPEN cur     

FETCH NEXT FROM cur INTO @userId      
WHILE (@@fetch_status <> -1)      
BEGIN      
 IF (@@fetch_status <> -2)      
 BEGIN      
  PRINT 'Killing connection ' + @userId      
  EXEC ('KILL ' + @userId)      
 END      
 FETCH NEXT FROM cur INTO @userId      
END      
      
CLOSE cur      
DEALLOCATE cur      
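The cursor above amounts to issuing one KILL per distinct session id returned by the dm_tran_locks query. A minimal Python sketch of the generated batch, assuming a list of session ids has already been fetched (the ids here are made up):

```python
def build_kill_statements(session_ids):
    # Deduplicate session ids (GROUP BY in the query) and emit one KILL each.
    return ["KILL {0}".format(sid) for sid in sorted(set(session_ids))]

# Hypothetical session ids, as dm_tran_locks might return them:
print(build_kill_statements([53, 54, 53]))  # ['KILL 53', 'KILL 54']
```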

20110210

Deploying Sitecore CMS to Windows Azure using VM Role

Sitecore CMS can be deployed to Microsoft Azure using Sitecore CMS Azure Edition, but as a proof of concept I tried uploading it using a Virtual Machine (VM) Role.

Using exactly the same procedure as with EPiServer, you can deploy Sitecore CMS. The only change required was updating the aspnet_Membership_GetNumberOfUsersOnline procedure, as the original is not compatible with Azure SQL. So after the database migration an additional query with the code below is required:

SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER OFF
IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[aspnet_Membership_GetNumberOfUsersOnline]') AND type in (N'P', N'PC'))
BEGIN
EXEC dbo.sp_executesql @statement = N'CREATE PROCEDURE dbo.aspnet_Membership_GetNumberOfUsersOnline
    @ApplicationName            nvarchar(256),
    @MinutesSinceLastInActive   int,
    @CurrentTimeUtc             datetime
AS
BEGIN
    DECLARE @DateActive datetime
    SELECT  @DateActive = DATEADD(minute,  -(@MinutesSinceLastInActive), @CurrentTimeUtc)

    DECLARE @NumOnline int
    SELECT  @NumOnline = COUNT(*)
    FROM    dbo.aspnet_Users u WITH (NOLOCK),
            dbo.aspnet_Applications a WITH (NOLOCK),
            dbo.aspnet_Membership m WITH (NOLOCK)
    WHERE   u.ApplicationId = a.ApplicationId                  AND
            LastActivityDate > @DateActive                     AND
            a.LoweredApplicationName = LOWER(@ApplicationName) AND
            u.UserId = m.UserId
    RETURN(@NumOnline)
END' 
END
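The procedure itself just counts users whose last activity falls within the configured window. A minimal sketch of the same calculation in Python, with hypothetical dates standing in for the LastActivityDate values:

```python
from datetime import datetime, timedelta

def users_online(last_activity_dates, minutes_inactive, now_utc):
    # Mirrors the stored procedure: count activity newer than the cutoff.
    cutoff = now_utc - timedelta(minutes=minutes_inactive)
    return sum(1 for d in last_activity_dates if d > cutoff)

# Made-up sample data: one user active 5 minutes ago, one 90 minutes ago.
now = datetime(2011, 2, 10, 12, 0)
dates = [now - timedelta(minutes=5), now - timedelta(minutes=90)]
print(users_online(dates, 15, now))  # 1
```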

20110209

Deploying EPiServer CMS to Windows Azure using VM Role

Intro


This post describes running EPiServer CMS version 6 on Windows Azure using the Virtual Machine (VM) Role Beta Program. The main goal was to take a developer Hyper-V machine and deploy it to the cloud as a staging environment.

Prerequisites



Creating VHD


First, Windows Server 2008 R2 is required - I've used Windows Server 2008 R2 Enterprise (Full Installation). Additionally, it should be installed on a single partition, which is not the default behaviour. There are a few posts on how to remove this additional partition, but sysprep (described later) fails after those operations - it can only be done during system installation:

  1. select Custom (Advanced) instead of Upgrade
  2. on partition selection screen press Shift + F10 to open console
  3. run diskpart
    • list disk
    • select disk 0
    • create partition primary
    • exit
  4. close console
  5. refresh
  6. select created partition

After installation go to Server Manager:
  1. add Role: Web Server (IIS)
  2. add Feature: .NET Framework 3.5.1 (inside .NET Framework 3.5.1 Features node)
You may need to switch off Windows Update, but it should be inactive by default.
You may back up the Virtual Machine now to reuse it later.

Installing Application


Now it's time to install EPiServer or any other application you'd like to deploy to the cloud. I've used an empty EPiServer version 6 with Public Templates. The only web.config change was switching to the SQL Role and Membership providers.

Setting up database


The local database needs to be migrated to Microsoft SQL Azure Database - there's a quite simple way to do it using the SQL Azure Migration Wizard.

Because this tool doesn't clean tables previously filled with data, this script can be used to remove all tables if another run is required for any reason.

You should modify connectionStrings.config to connect to Azure SQL instance and test if application works.

Finalizing VHD


It's highly recommended to back up the Virtual Machine now.
The further steps caused it to hang during start-up until it was running in the cloud.

Mount C:\Program Files\Windows Azure SDK\v1.3\iso\wavmroleic.iso as a DVD drive and install WaIntegrationComponents-x64.msi. Provide it with a valid Administrator password and let it install all the drivers it needs. After a reboot run %windir%\system32\sysprep\sysprep.exe, enter the options as on the image below, and let the Virtual Machine shut down - the VHD image is now ready for deployment.


If you ever start this Virtual Machine again (which is not guaranteed to succeed) you need to launch sysprep once more.

Creating Management Certificate


  1. Start Visual Studio
  2. create project of type C# / Cloud / Windows Azure Project.
  3. click Publish
  4. Select 'Deploy your Windows Azure project to Windows Azure'
  5. Credentials - select Add... and name it
  6. Click 'Copy the full path' to get Public key path and upload it to Azure



  7. Copy subscription ID back to Management Authentication dialog and click OK - it will be verified.
  8. You can now forget about this project :).

Uploading VHD


Open a Windows Azure SDK Command Prompt as an administrator from Start | All Programs | Windows Azure SDK v1.x.
Run the command below, replacing XXXXX and YYYYY with your Subscription Id and Certificate Thumbprint respectively. You should of course adjust the VHD path and destination location to reflect your needs.

csupload Add-VMImage -Connection "SubscriptionId=XXXXX; CertificateThumbprint=YYYYY" -Description "Base image Windows Server 2008 R2" -LiteralPath "D:\VM\AzureTest\Virtual Hard Disks\Clean win 2008 r2.vhd" -Name baseimage.vhd -Location "North Europe"



Anywhere locations are not supported (i.e. Anywhere Europe and Anywhere US).

Statuses visible in Management Portal while uploading:
  • pending means uploading
  • committed means ready

Creating the Service Model (Visual Studio part)



  1. Start VS2010
  2. create Windows Azure Project without any roles
  3. right-click Roles > Add > New Virtual Machine Role
  4. in properties of new role > Virtual Hard Disk tab > select account and uploaded VHD
  5. in Endpoints tab add:
    • name: HttpIn
    • type: Input
    • protocol: Http
    • public port: 80
    • private port: 80
  6. go to Publish
  7. Configure Remote Desktop (save certificate to file WITH the private key)
  8. select Create Service Package Only
  9. click OK

Creating the Service Model (Management Portal part)


  1. New Hosted Service - name it, give it URL prefix, choose region and 'Do not deploy'
  2. Add Certificate (select one created in previous step)
  3. New Production Deployment - name it and select two files created by VS (.cspkg and .cscfg)
  4. wait...
  5. wait more...
  6. wait even more...
  7. you can connect to IIS using URL on the right
  8. Connect on the ribbon downloads RDP file




Further work


All file system modifications on the Virtual Machine will be lost when it's reimaged from the uploaded VHD. It may happen because of a machine failure or restart. At the moment only data stored in Azure SQL is safe - every change to VPP files will be lost. There are two options - storing VPP data in the database or in Cloud Storage. There's a post on this by Daniel Berg but I haven't tried it yet.

There's also a licensing issue, as the EPiServer license is linked either to an IP or a MAC address. Unfortunately the MAC address changes during deployment as well. For now I've generated a new demo license using the MAC address obtained from the Virtual Machine, but it will be lost. Not sure what can be done about it.

20100531

WCF service consumed by jQuery on IIS7

For this example assume you need to return an Info structure (consisting of two strings: Title and Body) that can be queried using its Id.

First you have to define the data contract:

namespace Example
{
    [DataContract]
    public class Info
    {
        [DataMember]
        public string Title;

        [DataMember]
        public string Body;
    }
}

Then the interface of the service - it also defines the request and response format as JSON (which can be easily consumed from JavaScript).

namespace Example
{
    [ServiceContract(Namespace = "")]
    public interface IGetInfoService
    {
        [OperationContract]
        [WebGet(ResponseFormat = WebMessageFormat.Json, RequestFormat = WebMessageFormat.Json)]
        Info GetInfo(int id);
    }
}

The service logic, just to provide a full example :)

namespace Example
{
    [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]    
    public class GetInfoService: IGetInfoService
    {       
        public Info GetInfo(int id)
        {
            return
                new Info 
                    {
                        Title = id.ToString(),
                        Body = "Lorem ipsum"
                    };
        }
    }
}

The WCF service also requires a web.config section:

<system.serviceModel>
        <services>
                <service behaviorConfiguration="GetInfoServiceBehavior" name="Example.GetInfoService">
                        <endpoint address="" behaviorConfiguration="GetInfoEndpointBehavior" binding="webHttpBinding" contract="Example.IGetInfoService" />
                        <endpoint address="mex" binding="mexHttpBinding" contract="IMetadataExchange" />
                </service>
        </services>
               
        <behaviors>
                <serviceBehaviors>
                        <behavior name="GetInfoServiceBehavior">
                                <serviceMetadata httpGetEnabled="true" />
                                <serviceDebug includeExceptionDetailInFaults="false" />                                                                    
                        </behavior>
                </serviceBehaviors>
                <endpointBehaviors>
                        <behavior name="GetInfoEndpointBehavior">
                                <enableWebScript />
                        </behavior>
                </endpointBehaviors>
        </behaviors>
        <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
</system.serviceModel>

Consume the service using a jQuery $.ajax call.
params is an object that holds all the parameters; msg in the success function holds the response.

var params = new Object;
params.id = 10;

$.ajax({
 type: "GET",
 url: "/Services/GetInfoService.svc/GetInfo",
 data: params,
 dataType: "json",
 success: function (msg) {
  $(".info h3").html(msg.d.Title);
  $(".info p").html(msg.d.Body);
 },
 error: function (msg) {
  alert(msg);
 }
});
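Note the msg.d property above: the enableWebScript behavior wraps the JSON response in a d envelope. A quick sketch of unwrapping such a response (the payload values here are made up):

```python
import json

# Illustrative raw response body from the service (values are made up):
raw = '{"d": {"Title": "10", "Body": "Lorem ipsum"}}'

# Unwrap the ASP.NET AJAX "d" envelope to get at the Info object.
info = json.loads(raw)["d"]
print(info["Title"])  # 10
```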

If IIS reports 404 or 500 errors you probably need to register WCF:

c:
cd "C:\Windows\Microsoft.NET\Framework\v3.0\Windows Communication Foundation"
servicemodelreg -i
The command above registers *.svc handlers for the host, so if you have others defined for the site you have to remove them, or go to the Handler Mappings section and use the Revert To Inherited action.

20100423

Lucene large queries efficiency

Reading large data sets from Lucene, even after adding a document cache, can still be slow due to the algorithm that loads only 100 results and queries again each time the loaded amount is smaller than required. The documentation advises using a HitCollector to speed this up, but it's hard to find an example. A big disadvantage of this approach is that there is no sorting support when a HitCollector is used.
Below you can find an example LuceneHitCollector storing items in LuceneHitCollectorItems and using a LuceneSorter.
LuceneSorter is based on the internal Lucene sort algorithm and therefore uses its dictionaries built while the directory is created. It implements only a subset of Lucene's capabilities:
  • sorting by rank (when empty string is passed to Sort method)
  • sorting by string field
  • reverse order sorting

The last code example shows how to use the HitCollector. It's as easy as using Lucene itself and again much faster.

public class LuceneHitCollectorItem
{
    public float Score;
    public int DocId;
    public Document Doc;
}

public class LuceneHitCollector : HitCollector
{        
    private readonly IndexReader reader;
    private List<LuceneHitCollectorItem> docs = new List<LuceneHitCollectorItem>();

    public LuceneHitCollector(IndexReader reader)
    {
        this.reader = reader;
    }

    public override void Collect(int doc, float score)
    {
        Document document = reader.Document(doc);
        docs.Add(
            new LuceneHitCollectorItem
                {
                    DocId = doc,
                    Doc = document,
                    Score = score
                }
            );
    }

    public void Sort(string field, bool reverse)
    {
        LuceneSorter sorter = new LuceneSorter(reader, field, reverse);
        sorter.Sort(docs);
    }

    public int Count
    {
        get
        {
            return docs.Count;
        }
    }

    public LuceneHitCollectorItem Doc(int n)
    {
        return docs[n];
    }
}

public class LuceneSorter
{
    private string field;
    private bool reverse;
    private StringIndex index;

    public LuceneSorter(IndexReader reader, string field, bool reverse)
    {
        if (!String.IsNullOrEmpty(field))
        {
            String fieldName = String.Intern(field);
            index = FieldCache_Fields.DEFAULT.GetStringIndex(reader, fieldName);
        }
        this.field = field;
        this.reverse = reverse;
    }

    public void Sort(List<LuceneHitCollectorItem> docs)
    {
        if (String.IsNullOrEmpty(field))
        {
            docs.Sort(CompareScore);
        }
        else
        {
            docs.Sort(CompareField);
        }
    }

    private int CompareField(LuceneHitCollectorItem a, LuceneHitCollectorItem b)
    {
        int result = index.order[a.DocId] - index.order[b.DocId];
        if (result == 0)
        {
            result = a.DocId - b.DocId;
        }
        return (reverse) ? -result : result;
    }

    private int CompareScore(LuceneHitCollectorItem a, LuceneHitCollectorItem b)
    {
        int result = a.DocId - b.DocId;
        if (a.Score != b.Score)
        {
            result = (a.Score > b.Score) ? 1 : -1;
        }
        return (reverse) ? -result : result;
    }
}

LuceneHitCollector collector = new LuceneHitCollector(reader);
searcher.Search(query, collector);
collector.Sort(field, reverse);
for (int index = 0; index < collector.Count; index++)
{
    LuceneHitCollectorItem item = collector.Doc(index);
    Document doc = item.Doc;
    double rank = item.Score;
    //process document
}
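The CompareScore logic above - ascending by score, ties broken by doc id, negated by the reverse flag - can be sketched in a few lines of Python (tuples stand in for LuceneHitCollectorItem; the sample hits are made up):

```python
def sort_by_score(hits, reverse=False):
    # hits: (doc_id, score) pairs. Sort ascending by score, then doc id,
    # and flip the whole order when reverse is set - as CompareScore does.
    return sorted(hits, key=lambda h: (h[1], h[0]), reverse=reverse)

hits = [(2, 0.5), (1, 0.9), (3, 0.5)]
print(sort_by_score(hits))                # [(2, 0.5), (3, 0.5), (1, 0.9)]
print(sort_by_score(hits, reverse=True))  # [(1, 0.9), (3, 0.5), (2, 0.5)]
```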

Speeding up Lucene Index

Lucene is an indexing and search technology that also provides spellchecking, hit highlighting and advanced analysis/tokenization capabilities. The .NET version is an automatic port of Java Lucene.
The solution works perfectly in most cases, but recently I've found it too slow. The directory can be stored in the file system or in memory. While search indexes are cached in memory, document contents are loaded each time from the directory. I haven't checked the Java implementation, but the .NET one seems to parse files very slowly. The file system version additionally requires lots of disc access, which is probably well cached by the operating system but is not efficient for large indexes (in my case the file size is over 500MB). The memory directory implementation tends to leak, which is unacceptable.
The solution below demonstrates how to enable document caching in Lucene.
The cache needs to be added to the FieldsReader class. The code below shows only the changes that need to be made:
  • A cache dictionary needs to be added.
  • The cache needs to be cleared at the end of the Close() method.
  • The document is returned from the cache, if already loaded, at the beginning of the Doc() method.
  • The document is added to the cache after being read in the Doc() method.
This change has a big memory cost but speeds up large queries even up to 10 times.
Further optimization can be found here.

public sealed class FieldsReader
{
    private SortedList<int,Document> cache=new SortedList<int, Document>();
//---------
    public void Close()
    {
//---------
        cache.Clear();
    }
//---------
    public Document Doc(int n)
    {
        if (cache.ContainsKey(n))
        {
            return cache[n];
        }
//---------
        if (!cache.ContainsKey(n))
        {
            cache.Add(n, doc);
        }
        return doc;
    }
//---------
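The change boils down to memoizing documents by their number. A minimal Python sketch of the same pattern, where load_doc is a hypothetical stand-in for the original on-disk parsing:

```python
class CachingReader:
    """Sketch of the FieldsReader change: memoize documents by doc number."""

    def __init__(self, load_doc):
        self._load_doc = load_doc  # hypothetical slow on-disk load
        self._cache = {}
        self.loads = 0  # counts how often the slow path actually runs

    def doc(self, n):
        # Serve from cache when possible; otherwise load once and remember.
        if n not in self._cache:
            self._cache[n] = self._load_doc(n)
            self.loads += 1
        return self._cache[n]

    def close(self):
        # Mirror of clearing the cache in Close().
        self._cache.clear()

reader = CachingReader(lambda n: "document %d" % n)
reader.doc(7)
reader.doc(7)
print(reader.loads)  # 1 - the second call was served from the cache
```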

20100326

Listing your Twitter messages by RSS

Frequently it's useful to list your Twitter messages on another website or process them in some other way. There are billions of plugins that will allow you to do it, but if you need to download the messages manually it's best to use RSS. You don't need to use the API, so you don't need to register. The only disadvantage is that you need the user's ID, not their name. You can obtain it from the user page - there is an RSS link in the right-hand column - the number in that link is the user ID that needs to be passed to the code.

The code below consists of three classes that will get the tweets for you. There is also caching, so remember that new messages will be fetched only every UPDATE_EVERY_MINUTES minutes.
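The refresh rule used by the cache is just a deadline check - a minimal sketch in Python, with a hypothetical needs_refresh helper (not part of the C# code below):

```python
from datetime import datetime, timedelta

UPDATE_EVERY_MINUTES = 60

def needs_refresh(next_update_time, now):
    # Refresh when nothing is cached yet, or when the cached entry's
    # next-update deadline has already passed.
    return next_update_time is None or next_update_time < now

now = datetime(2010, 3, 26, 12, 0)
fresh = now + timedelta(minutes=UPDATE_EVERY_MINUTES)
print(needs_refresh(None, now))   # True - nothing cached yet
print(needs_refresh(fresh, now))  # False - still within the window
```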

Single message:

public class TwitterCacheItem
    {
        public string Title {get;set;}
        public string Link {get;set;}
        public string Description {get;set;}
    }

All of them:

public class TwitterCacheChannel : TwitterCacheItem
    {
        public TwitterCacheChannel()
        {
            Items = new List<TwitterCacheItem>();
            Loaded = false;
        }

        public DateTime NextUpdateTime { get; set; }
        public bool Loaded { get; set; }
        public List<TwitterCacheItem> Items { get; private set; }
    }

And finally the code:

public class TwitterCache
    {
        private const int UPDATE_EVERY_MINUTES = 60;

        private static SortedList<int, TwitterCacheChannel> cache = new SortedList<int, TwitterCacheChannel>();
        private static object cacheLock = new object();


        public static TwitterCacheChannel GetTwitts(int id)
        {
            TwitterCacheChannel result = null;
            lock (cacheLock)
            {
                if (cache.ContainsKey(id))
                {
                    result = cache[id];
                }
            }
            if ((result==null)||
                (result.NextUpdateTime<DateTime.Now))
            {
                result = FetchTwitts(id);
            }
            return result;
        }

        private static string FetchTwittsUrl(int id)
        {
            return String.Format("http://twitter.com/statuses/user_timeline/{0}.rss", id);
        }

        private static TwitterCacheChannel FetchTwitts(int id)
        {
            TwitterCacheChannel result = new TwitterCacheChannel();
            if (id > 0)
            {
                try
                {
                    string url = FetchTwittsUrl(id);
                    string rssText = DownloadWebPage(url);
                    XmlDocument rss = new XmlDocument();
                    rss.LoadXml(rssText);
                    XmlElement channel = rss["rss"]["channel"];
                    result.Title = channel["title"].InnerText;
                    result.Link = channel["link"].InnerText;
                    result.Description = channel["description"].InnerText;
                    foreach (XmlElement node in channel.SelectNodes("item"))
                    {
                        TwitterCacheItem item = new TwitterCacheItem();
                        item.Title = node["title"].InnerText;
                        item.Link = node["link"].InnerText;
                        item.Description = node["description"].InnerText;
                        result.Items.Add(item);
                    }
                    result.Loaded = true;
                }
                catch
                {
                }
            }
            result.NextUpdateTime = DateTime.Now.AddMinutes(UPDATE_EVERY_MINUTES);
            lock (cacheLock)
            {
                if (cache.ContainsKey(id))
                {
                    cache[id] = result;
                }
                else
                {
                    cache.Add(id, result);
                }
            }
            return result;
        }

        private static string DownloadWebPage(string url)
        {
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
            request.UserAgent = ".NET Framework/2.0";
            request.Referer = "http://www.example.com/";
            WebResponse response = request.GetResponse();
            Stream webStream = response.GetResponseStream();
            StreamReader reader = new StreamReader(webStream);
            string result = reader.ReadToEnd();
            reader.Close();
            webStream.Close();
            response.Close();
            return result;
        }
    }