22.9.15

How To Clone a Virtual Machine in Azure

This is quite simple, once you know how to do it but... when you don't...

My favourite way of cloning virtual machines has always been copying the hard disks and deploying the virtual machine in a different network segment. We do have all the ingredients we need for this recipe in Azure so let's get to it!

You will need:

  • The blob that contains the vhd you want to clone
  • A new Cloud Service (or one where the source virtual machine is not running)
  • Notepad.exe (or similar, although nothing beats the original)
  • Azure PowerShell
  • Attention to detail
  • Just a bit of patience

Let's start.

The first thing you need to do is to find out the disk(s) you want to clone. In the virtual machine's dashboard page, if you scroll down a bit, you will be able to see them.

After you scroll down, do not forget to scroll right too; the URL we need is a bit hidden... copy it into your notepad.

This is the address of the blob that contains the disk, and it follows this pattern:

http://[yourStorageAccount].blob.core.windows.net/[yourContainerName]/[yourBlobName].vhd

Finally, you need the primary access key for your storage account.

Go to Storage, select your Storage Account and look on the footer of the page for the key icon:


There, copy your primary key and paste it into your notepad.

And that's all you need.

#From http://michaelwasham.com/windows-azure-powershell-reference-guide/copying-vhds-blobs-between-storage-accounts/
#And after that from http://www.codegrimoire.com
####################################################################################################################
### Source VHD - anonymous access container ###
$storageUri = "http://yourStorageAccount.blob.core.windows.net/"
$containerName = "yourContainerName" ##I am using the same source and destination here
$sourceBlobName = "yourBlobName.vhd"

### Destination Blob
$destBlobName = "newBlobName.vhd"

######### New Disk Name
$newDiskName =  "newDiskName"
$newDiskLabel = "BootDisk" ##In the documentation it is either BootDisk or DataDisk but you can call it something else
$isOsDisk = $true ##$true or $false. As I only work with Windows I will not bother about other OSs in this script

### Target Storage Account ###
$storageAccount = "yourStorageAccount"
$storageKey = "yourPrimaryKey" ##Primary Access Key


###################################################################################
############  Automated script for creating the new VHD  ##########################
###################################################################################

$srcUri = ($storageUri.trim('/'), $containerName.trim('/'), $sourceBlobName) -join '/'
$destUri = ($storageUri.trim('/'), $containerName.trim('/'), $destBlobName) -join '/'
 
### Create the destination context for authenticating the copy
$destContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
 
### Create the target container in storage
### Not necessary as I am using an existing one ### New-AzureStorageContainer -Name $containerName -Context $destContext 
 
### Start the Asynchronous Copy ###
$blob1 = Start-AzureStorageBlobCopy -srcUri $srcUri -DestContainer $containerName -DestBlob $destBlobName -DestContext $destContext

### Loop until complete ###                                    
Do{
  $status = $blob1 | Get-AzureStorageBlobCopyState 
  ### Print out status ###
  $status.Status
  Start-Sleep 5
}While($status.Status -eq "Pending") ##This doesn't work as you would expect but the idea is good and maybe they will change the way Get-AzureStorageBlobCopyState works :)


######## After the new blob has been created we will add the new disk #############
if ($isOsDisk){
    Add-AzureDisk -DiskName $newDiskName -MediaLocation $destUri -Label $newDiskLabel -OS "Windows"
}
else{
    Add-AzureDisk -DiskName $newDiskName -MediaLocation $destUri -Label $newDiskLabel
}

Once you edit the script with your data and run it, you will have to wait a couple of minutes before the new disk is available. You do not need to turn off the source virtual machine, although it's better to be safe than sorry.

After that go to Virtual Machines, New, Compute, Virtual Machine, From Gallery and choose My disks in the lower part of the left column:

The disk you have just created will appear in the list and after that you just need to create the new virtual machine normally.

By the way, we need an Azure + SharePoint administrator in London. Interested?


Is it a good idea to call the cloned VMs Dolly?


25.8.15

Windows 10 after the first month

I actually had W10 installed back in January, from a preview build, and only because I wanted to be careful and skip the very first insider preview. But then I found some very annoying bugs that made me uninstall it and go back to W8.1. Finally (and it looks like I mean Literally Finally, as they say this is the last version of Windows ever) I reinstalled the real W10 on the 29th... and realised, much more resigned than surprised, that the bugs that made me uninstall it were still there.

It's pretty obvious that my bugs are due to my particular setup and they are nothing you need to worry about. Who else would have two monitors daisy-chained to a Surface Pro 3 and a HiFi system hooked to them? If you have that type of setup you probably deserve losing the image on your monitors and waking up every morning without sound.

I already know how to work around these issues. If your monitors stop working you need to unplug them from the wall (turning them off or disconnecting the DP cable will not do; you need to pull the plug out of the wall) and plug them back in. That, plus a reset, fixes it. As for the sound, it's either changing the sound quality or restarting the machine. But enough of my old-man whining already.

I have been working with Windows 10 on my everyday computer for three or four months, and the single thing that makes me love the system every day is the fact that it is Desktop-Centric (this is my innovative use of the English language again).

Windows 8 was based on apps, and the operating system pushed you to use the apps first... Even the desktop was an app... The desktop... AN APP!? The more I think about it the more outrageous it feels. It kind of makes sense if you have a tablet, but 99% of the people who work with a computer do it sitting in their offices with two 24" monitors. The desktop an app... that's nonsense... an app...

Windows 10 runs from the desktop (in desktop computers) and that small change makes everything make sense again.

Your computer starts with your familiar desktop, where you have all your familiar applications; you feel at home. Suddenly you need to download something... You can either use your familiar browser to go out into the wilderness that is the internet and download a random piece of software from a random server (one that will have access to most of the resources in your computer and, without you noticing, will install a bar in your browser), or you can go to the now natural place to download software: the Store.

The apps from the Store now work as windows on the desktop. You can maximise them as you always could, you can snap them to a side of the screen as you always could, and you can use them in all the usual ways with all the freedom you are used to. That is fantastic. It makes the boundary between a mobile device, like a phone or a tablet, and your desktop disappear. You can use the same app everywhere, but if you are at your desktop you will be able to run it right beside your beloved Winamp (why Winamp? Because you always secretly dreamed about whipping a llama's ass).

I do not consider myself a common computer user; in fact, most days I use just Remote Desktop and maybe the browser for reading the news, if I'm in that mood. But since I have had W10 I have downloaded a couple of apps that I use sometimes, and the more apps you download and use, the more natural it feels.

With Windows 10, the ability to run apps on Windows phones no longer needs to be the driver for migrating programs to apps. Windows programs are so from the 80's...

If the integration between apps and EXEs is so seamless and apps are the coolest thing around, why have all the companies not migrated all the code from the last 10 years to apps already? After all, it has been a month since they released Windows 10...

Because it took them 10 years to create that code in the first place.

I'm not saying that migrating an application would take the same amount of time as creating it from scratch, but it's still an investment. It will mean migrating, at least, the whole user experience. And how often do programs change their user interface?


I suspect the risk-averse software companies (or the bigger ones, which take longer to react) will wait until they need an interface change to adopt the new paradigm, while the new software companies and the more dynamic ones will embrace the apps world as soon as they release a new version.

What do you think?


29.7.15

Where is my Windows 10 Upgrade?

That's obviously the first thing I asked myself today :)

It's in your desktop's Windows Update!

Right click on the Start button and open the Control Panel.


Once there, look for Update


Finally click on Windows Update and...


I am So Excited! Yes, I am that kind of person.

Well... almost 3GB... I can't wait... I'll see you in a while... I can't wait!


9.7.15

Is JavaScript The Emperor's New Clothes?

Script (computing), a small non-compiled program written for a scripting language or command interpreter.

If JavaScript has Script in its name, what made you think it would be a suitable language to base the whole web and half of the mobile software on?

If JavaScript: The Good Parts is a best seller and it's only a 150-page manual (50 pages are appendixes and only 2 are about beautiful features, and I am pretty sure even Mr Crockford found it excruciating to write that many)...

If you are required to migrate from one version to another every now and then, because the software provider will only support your bits for the next five years, what makes you think it's a good idea to download and install, into the core of your software, a random minified .js file that you can't read, whose origin you don't know and which, of course, was forgotten by its developer five days after he published it?

Will I work with it?

Of course I will; in fact, I have been doing it for ages, just not at this level. If you can do anything with a Turing machine, why not embrace a scripting language for building applications thousands of lines long? That's what the brave would call a challenge.

I have been creating classes and constructors and inheritance, and it can all be done in JS, but I am pretty sure translating ancient OO structures and patterns to JS is not the right way; that's probably my problem. Instead of embracing the new language I am trying to translate my old jokes. That will probably change with time, as I learn the new ones.
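For instance, this is the kind of translation I mean: recreating classical inheritance with pre-ES6 constructor functions and prototypes. A minimal sketch (the Shape/Square names are just illustrative, not from any real project):

```javascript
// A constructor function: the pre-ES6 way of writing a "class".
function Shape(name) {
  this.name = name;
}
Shape.prototype.describe = function () {
  return "I am a " + this.name;
};

// "Inheritance": call the base constructor and chain the prototypes.
function Square(side) {
  Shape.call(this, "square");
  this.side = side;
}
Square.prototype = Object.create(Shape.prototype);
Square.prototype.constructor = Square;
Square.prototype.area = function () {
  return this.side * this.side;
};

var sq = new Square(3);
console.log(sq.describe()); // "I am a square"
console.log(sq.area());     // 9
console.log(sq instanceof Shape); // true
```

It works, but it is exactly the old joke told in a new language, which is the point of the paragraph above.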

The fact that we have no other option plays a role here too. If you are working for the web, you are free to use JavaScript, or free not to work for the web, à la Apple.

Do I understand its advantages?

The learning curve is great, as the basics are simpler. No types, no safety net. No need for an IDE; no great IDEs either (if you are used to Visual Studio you'll love losing all the features you are used to). It can be executed everywhere, provided they have the right browser, and that's not the case in many big companies. And a great community of fans, à la Apple.

Do I like JavaScript?

Do I like a piece of technology that, if we are lucky, will solve five years from now all the problems we solved with Silverlight eight years ago?

I probably will when both me and the technology are mature enough.

Of course I do, I want to be cool...


I learnt to love WPF but it didn't take this long.


11.6.15

Migrating SharePoint Users to a New Domain

I have been dreading this type of migration for years... so many years that I already had planned a couple of ways of solving the issue. It finally happened.

Scenario:

Someone decides we need to move the farm to a completely different environment, with a new AD, in a new city.

Well, let's get to it. We have created a new SharePoint farm in the new environment and we have backed up and restored the content databases. We have manually changed the admin of the site collection to the new admin in the new AD in the new farm and we can access the site and see the data. Fantastic.

Fantastic?

The users in the list items are the users from the old farm. And we have several lists with a lot of user fields. And some of our lists have tens or hundreds of thousands of rows. Changing them manually is not an option.

First idea: Go refined and try stsadm -o migrateuser:

Ohh so easy... we change the login name of the user to something else and we are good because the user IDs are still the same... NO.

This is a new domain and we don't have access to the old domain users so the migrateuser parameter throws a nice User not found error.

Second idea: Go berserk and change the strings in the list items

And that worked. Oh the beauty of a simple idea. The process is pure brute force... beautiful in its barbarity... If you have read up to here you are probably desperate for a solution.

Step One:
Get all of the users from the old farm in an XML file or something really high tech (a csv could work too).

using System;
using System.Xml.Linq;
using Microsoft.SharePoint;

static void Main(string[] args)
{
    using (SPSite site = new SPSite(args[0]))
    {
        using (SPWeb web = site.OpenWeb())
        {
            XElement users = new XElement("Users");

            foreach (SPUser user in web.SiteUsers)
            {
                XElement xmlUser = new XElement("User");
                xmlUser.Add(new XAttribute("Name", user.Name));
                xmlUser.Add(new XAttribute("LoginName", user.LoginName));
                xmlUser.Add(new XAttribute("ID", user.ID));

                users.Add(xmlUser);
            }

            users.Save("SiteUsers.xml");
        }
   
    }

    Console.WriteLine("\nProcess finished...");
    Console.ReadLine();
}


Step Two:
Make sure all the users you need are in the new AD. As you have a list in XML you can pass it to someone with privileges in the AD.

Step Three:
Ensure the users in SharePoint, add them to a group with reading permissions and then iterate through all the items in the list changing the users from the old domain to the users in the new one.

using System;
using System.Collections.Generic;
using System.Xml.Linq;
using Microsoft.SharePoint;

static void Main(string[] args)
{
    XElement users = XElement.Load("SiteUsers.xml");
    string newDomain = "XXXXXXXX";

    string ListName = string.Empty;
    if (args.Length == 2) ListName = "Stratex Framework";
    else ListName = args[2];

    //Args are SiteUrl VisitorsGroup ListName
    using (SPSite site = new SPSite(args[0]))
    {
        using (SPWeb web = site.OpenWeb())
        {
            SPList listToUpdate = web.Lists[ListName];

            Dictionary<string, SPUser> NewUsers = new Dictionary<string, SPUser>();

            foreach (XElement user in users.Descendants("User"))
            {
                string LoginName = FixDomain(user.Attribute("LoginName").Value, newDomain);
                SPUser spUser = null;
                try
                {
                    //We try to ensure all the users from the XML file. We'll probably need them
                    spUser = web.EnsureUser(LoginName);
                }
                catch
                { Logger.WriteLine("The user {0} could not be found.", LoginName); }

                if (spUser != null)
                {
                    SPGroup viewers = web.Groups[args[1]];

                    viewers.AddUser(spUser);
                    //Finally we add them to a group with read permissions
                    //We can worry about restricting this further after the migration

                    //The dictionary is keyed by the old user ID, which is what the list items store
                    string oldId = user.Attribute("ID").Value;
                    if (!NewUsers.ContainsKey(oldId)) NewUsers.Add(oldId, spUser);
                }
            }

            web.Update();


            UpdateUsersInList(listToUpdate, NewUsers);
        }

    }

    Logger.WriteLine("\nProcess finished...");
    Console.ReadLine();
}

private static void UpdateUsersInList(SPList list, Dictionary<string, SPUser> NewUsers)
{
    int itemsInList = list.ItemCount;
    Logger.WriteLine("Updating users at {0}. {1} items.", list.Title, itemsInList.ToString());

    SPQuery qry = new SPQuery();
    qry.ViewAttributes = "Scope=\"RecursiveAll\"";
    SPListItemCollection allItems = list.GetItems(qry);
    int count = 0;

    UpdateCount(count++, itemsInList);

    foreach (SPListItem item in allItems)
    {
        try
        {
            bool changed = false;
            SPFieldCollection allFields;
            //If the item has a content type it usually has fewer fields
            if (item.ContentType == null)
                allFields = item.Fields;
            else
                allFields = item.ContentType.Fields;

            foreach (SPField field in allFields)
            {
                if (field is SPFieldUser)
                    changed = ChangeUserToNewDomain(item, field, NewUsers) || changed;
            }

            changed = ChangeUserToNewDomain(item, item.Fields.GetFieldByInternalName("Author"), NewUsers) || changed;
            changed = ChangeUserToNewDomain(item, item.Fields.GetFieldByInternalName("Editor"), NewUsers) || changed;

            if (changed) item.SystemUpdate(false); //if the item has not been changed we won't update it to save time
        }
        catch (Exception ex) { Logger.WriteLine("Failed to update item {0}. Exception {1}", item.Title, ex.Message); }

        UpdateCount(count++, itemsInList);
    }

    UpdateCount(count++, 0);
}

private static bool ChangeUserToNewDomain(SPListItem item, SPField field, Dictionary<string, SPUser> NewUsers)
{
    bool changed = false;
    string fieldContent = item[field.InternalName] == null ? null : item[field.InternalName].ToString();

    if (string.IsNullOrEmpty(fieldContent)) return false;

    List<string> oldUserIds = GetUserIDs(fieldContent.Split(new string[] { ";#" }, StringSplitOptions.RemoveEmptyEntries));

    if (oldUserIds.Count == 1)
    {   //The field has only one user in it
        SPFieldUserValue foundUser = FindUser(NewUsers, oldUserIds[0]);

        if (foundUser != null)
        {
            item[field.InternalName] = foundUser;
            changed = true;
        }
    }
    else if (oldUserIds.Count > 1)
    {   //The field has several users in it
        SPFieldUserValueCollection usersInField = new SPFieldUserValueCollection();
        foreach (string oldUser in oldUserIds)
        {
            SPFieldUserValue foundUser = FindUser(NewUsers, oldUser);

            if (foundUser != null)
                usersInField.Add(foundUser);
        }

        if (usersInField.Count > 0)
        {
            item[field.InternalName] = usersInField;
            changed = true;
        }
    }
            
            
    return changed;
}

private static List<string> GetUserIDs(string[] UserTokens)
{   //We do not care about the login name. The ID is gold
    List<string> result = new List<string>();

    if (UserTokens.Length > 0)
    {
        for (int i = 0; i < UserTokens.Length; i++)
        {
            int id;

            if (i % 2 == 0 && int.TryParse(UserTokens[i], out id))
                result.Add(id.ToString());
        }
    }

    return result;
}

private static SPFieldUserValue FindUser(Dictionary<string, SPUser> NewUsers, string oldUser)
{
    SPUser foundUser = null;

    if (NewUsers.ContainsKey(oldUser)) foundUser = NewUsers[oldUser];
    else
    {
        //If we can't find the ID of the user we will still try with the login or even with the Display Name
        foreach (SPUser newUser in NewUsers.Values)
        {
            if (newUser.Name == oldUser || newUser.LoginName == oldUser) { foundUser = newUser; break; }
        }
    }

    if (foundUser != null)
        return new SPFieldUserValue(foundUser.ParentWeb, foundUser.ID, foundUser.Name);
    else
        return null;
}

private static string FixDomain(string loginName, string newDomain)
{
    //Here we change the users from XXXXX\\User to YYYYY\\User
    //The source domain was claim based
    if (loginName.Contains("|")) loginName = loginName.Split('|')[1];

    string[] tokens = loginName.Split('\\');

    tokens[0] = newDomain;

    return string.Join("\\", tokens);
}

private static void UpdateCount(int currentItem, int itemsInList)
{
    int percentage;

    if (currentItem == 0) percentage = 0;
    else if (itemsInList == 0) percentage = 100;
    else
    {
        //We will only change the value every 10 times to make the process faster.
        if (currentItem % 10 != 0) return;
        percentage = currentItem * 100 / itemsInList;
    }
    Console.Write("\r");
    if (percentage >= 0 && percentage < 10)
        Console.Write("  ");
    else if (percentage >= 10 && percentage < 100)
        Console.Write(" ");

    Console.Write("{0}%", percentage);
}

This is a first prototype that has worked as expected, but it is far from fully tested. If you need it, you can use it as a base to develop your own tool.

The one who possesses the strings has the power.


10.6.15

Why Windows 10 Will Be a Game Changer

Do you remember having to meet at your friend's house to ask him to lend you his Zanac cassette tape?

Do you remember buying a double cassette player to be able to copy the game just for testing purposes?

Then the floppy disks arrived and we had those huge bendy 5 1/4" disks. They were really fast and their capacity was amazing, but they didn't change the fact that you needed to go to your friend's house to get a copy of Alley Cat.

Then the internet arrived. And you still had to go to your friend's house to get the diskette of Ishar, because walking to the other side of the planet took much less time than downloading 1 MB.

But the internet was a game changer. After some time you completely forgot about 3.5" diskettes and you would only visit your friend to get the CD of Dungeon Keeper II or something like that.

And some time later (today) you can download 60GB virtual machines in minutes. It looks like the moment proper internet arrived I stopped enjoying computers...

Windows 10 will be similar when it comes to changing the paradigm of software distribution, and I will explain my point.


Most of the programs and games for desktops were created for Windows and not for Linux or Mac, and there's a good reason for that: the market share of the Windows desktop made working for any other platform a waste of money.

Then the marketplaces arrived. Instead of going out to the scary internet to download your programs, you would go to a supervised environment where you could download them safely.

BUT

When it comes to desktops, only a "ridiculous" 16.45% of computers (W8 + 8.1) can execute marketplace applications, while the vast majority of desktops, a 72.36% (W7 + XP), are still stuck in the old paradigm (source).

And the question comes again... Why develop an application for 16% of the computers when, for the same amount of money, I can create the same program for 88.81% of them (given that W8 and 8.1 are also compatible with the old desktop code)? It's a no-brainer.
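To put numbers on that no-brainer, a quick back-of-the-envelope sketch (the percentages are the ones quoted above; the reach multiplier at the end is my own arithmetic, not a figure from any source):

```javascript
// Desktop market-share figures quoted above (mid-2015).
var storeCapable = 16.45; // W8 + 8.1: can run Store apps and classic programs
var legacyOnly = 72.36;   // W7 + XP: classic desktop programs only

// A classic desktop program runs on both groups, so its reach is the sum.
var classicReach = Number((storeCapable + legacyOnly).toFixed(2));
console.log(classicReach); // 88.81

// Relative audience of a classic program versus a Store-only app.
console.log((classicReach / storeCapable).toFixed(1) + "x"); // "5.4x"
```

Roughly five times the audience for the same development money: that is the gap Windows 10 has to close.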

And here comes Windows 10.

Offering Windows 10 as a free upgrade will surely convince most of the users out there on W7, 8 and 8.1 to get the latest bits, and with them, the ability to become clients of this new marketplace.

Not only that: given that Windows 10 apps will also work on any hardware capable of executing W10, the target audience will not only increase but probably multiply.

You will develop an app once and it will be downloadable by desktop users, but also by Windows Phone users, Xbox users, Raspberry Pi users... HoloLens users! You name it.


Microsoft's estimate is that in a couple of years there will be one billion W10 devices out there, and then the question will again be:

Why would anyone develop for any other platform?


4.6.15

Geeky T-Shirt For Free... Count Me In, Xamarin!

I firmly believe that once the majority of the desktop computers in the world are capable of running Microsoft Store applications (Windows 10 will be released on July 29), the natural way of developing anything will be through the Microsoft Store. I usually get excited whenever Microsoft says anything (excited or angry, but mostly excited) because I am quite a naïve guy.

But even so, in the same way some Android and iOS developers port their applications to the Microsoft Store, we might need to port our applications to other platforms. The other markets will be ridiculously small, but still.


I have been waiting for the right opportunity to give Xamarin a go and now, with a free t-shirt on offer and a new tablet project for which we are exploring new platforms, seems like the right moment.

The installation process on my Surface Pro 3 has not been easy. I struggled to make it run in VS2015, so I started using Xamarin Studio. I had one small issue with XS, nothing that 30 seconds in the Xamarin forums could not fix. All looked good, but when I tried starting the application in the virtual devices I couldn't. Using real hardware was not an option, because at home I only have Windows Phones, from 6.1 to 10.

I found the solution in getting and installing the Xamarin Player, which also installed VirtualBox; after that, I downloaded and configured a Nexus 4 VM.

In the last step, after you select which t-shirt you want, you need to go to the code and make one small change. Even though the change was small, it made me feel more inclined to hack through the code and modify it to do something else.



I can't wait to get the t-shirt and be one of the LINQ... do the process if you want to know what I mean here.

Oh and the app has a lot of code that you will probably reuse too... another gift!




11.2.15

No more recursive functions to define CAML Queries thanks to Camlex

Sometimes you have a variable number of conditions to check in a CAML query, and in those cases I used to build the queries using a recursive function that I usually have to debug a few times (I mean, that works perfectly on the first go).

The code for those queries would be something like this (and this is a simple one):

public List<string> GetSomeInfo(string fieldsToSearch, string contentTypesToSearch)
{
    ...

    var queryval = string.Empty;
    if (contentTypesToSearch.IsNullOrEmpty())
        queryval = string.Format("<Where>" + GenerateFieldsQuery(fieldsToSearch.Split(','), 0) + "</Where>", text);
    else
        queryval = string.Format("<Where><And>" + GenerateCTypesQuery(contentTypesToSearch.Split(','), 0) + GenerateFieldsQuery(fieldsToSearch.Split(','), 0) + "</And></Where>", text);

    var scope = "Scope=\"RecursiveAll\"";

    ...
}

private static string GenerateFieldsQuery(string[] fields, int index)
{
    if (fields.Length == 0) return string.Empty;

    if (fields.Length == index + 1)
        return "<Contains><FieldRef Name='" + fields[index] + "' /><Value Type='Text'>{0}</Value></Contains>";

    return "<Or><Contains><FieldRef Name='" + fields[index] + "' /><Value Type='Text'>{0}</Value></Contains>" + GenerateFieldsQuery(fields, ++index) + "</Or>";
}

private static string GenerateCTypesQuery(string[] cTypes, int index)
{
    if (cTypes.Length == 0) return string.Empty;

    if (cTypes.Length == index + 1)
        return "<Eq><FieldRef Name='ContentType' /><Value Type='Choice'>" + cTypes[index] + "</Value></Eq>";

    return "<Or><Eq><FieldRef Name='ContentType' /><Value Type='Choice'>" + cTypes[index] + "</Value></Eq>" + GenerateCTypesQuery(cTypes, ++index) + "</Or>";
}

That was until now... Thanks to Camlex (and thanks to Luis for showing it to me), that code can be written like this:

public List<string> GetSomeInfo(string fieldsToSearch, string contentTypesToSearch)
{
    ...

    var queryVal = string.Empty;
    var fieldExtensions = new List<Expression<Func<SPListItem, bool>>>();
    var cTypeExtensions = new List<Expression<Func<SPListItem, bool>>>();

    if (!contentTypesToSearch.IsNullOrEmpty())
    {
        foreach (var cType in contentTypesToSearch.Split(','))
            cTypeExtensions.Add(x => (string)x["ContentType"] == cType);
    }

    foreach (var field in fieldsToSearch.Split(','))
        fieldExtensions.Add(x => ((string)x[field]).Contains(text));

    var expressions = new List<Expression<Func<SPListItem, bool>>>();
    expressions.Add(ExpressionsHelper.CombineOr(cTypeExtensions));
    expressions.Add(ExpressionsHelper.CombineOr(fieldExtensions));

    queryVal = Camlex.Query().WhereAll(expressions).ToString();

    ...
}

I'll miss the recursive methods though... they made me feel special...


24.1.15

Get Cortana in Windows 10 Desktop Outside the US

One of the features I wanted to test the most on the new release of Windows 10 was Cortana, but living outside the US, it was disabled :(

Fear not, it will take seconds to have the system configured.

Look for Region and Language Settings in the search bar where Cortana should be and open it.

There, change your location to United States and then click on Add a language. Of course, the language you need to add is English (United States); then make English (United States) your primary language (you don't need to remove your other languages or keyboards).

Once your screen looks like this:


Restart the machine and you'll have Cortana waiting for you there.


If you are anything like me you will be yelling "Hey Cortana" at the computer, closer and closer to the mic. Don't. Before it works you need to configure it. Click on the search bar, look for the hamburger icon and click on Settings:


And finally enable everything :)


And now say: "Hey Cortana, How are you doing?"


She's so polite...

I just need to ask her why the accent colour of my Windows is suddenly brown U_U


15.1.15

Win RT Universal app with a DocumentDB

NoSQL databases have been brought to my attention a couple of thousand times in the last few months and, given that I am not the tidiest of database designers and that NoSQL databases are supposed to be designed to scale, I have decided to give them a go.

My platform of choice is Microsoft, and it looks like we have all the tools we need: Windows apps that work the same on desktops and phones, and DocumentDB in Azure. Fantastic, let's begin.

Let's start with the database; as it takes a while to create, that will give you time to work in Visual Studio in the meantime.

As I already have my Azure account and everything set up, I went straight to the creation of the database account in the preview management portal.

The process is explained in detail here: http://azure.microsoft.com/en-us/documentation/articles/documentdb-create-account/ 

First you need to specify a couple of parameters to set up the new database (you won't necessarily see the page in Spanish; in fact, I don't know why it is not showing it to me in English).



And once you are done you will be taken to a home page of the azure portal while you wait...


And while we wait we can go to Visual Studio and start creating the projects we need. I usually say things like easy, piece of cake, etc., but surely not so often today.

First of all I have created a universal app in Visual Studio 2013; as I am so outstandingly good with user experience, it will be the natural option for me to create two user interfaces, one for tablets and one for phones...


Now let's create a "Class Library (Portable for Universal Apps)" project to manage the DocumentDB connections and queries:


Once the DLL project was created I added a folder called Models with a DbUser class and a Device class:
namespace DocDbBridge.Models
{
    public class DbUser
    {
        public string Email { get; set; }
        public string Name { get; set; }
        public Device[] Devices { get; set; }
    }
}


namespace DocDbBridge.Models
{
    // Device must be public: DbUser exposes it through a public property,
    // so an internal class would cause an "inconsistent accessibility" compile error
    public class Device
    {
        public string Name { get; set; }
        public string Brand { get; set; }
    }
}

Finally I went to the main class (which I called Connection) and added the following usings:


using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;

It doesn't work because we are missing the Microsoft Azure DocumentDB Client Library 0.9.2-preview. In order to get it I have set the DLL to target .NET Framework 4.5.1.


Then I got Newtonsoft.Json from NuGet:


That creates a packages.config file in the project. In it I have added a line for Microsoft.Azure.Documents.Client and rebuilt the project. The packages.config file now looks like this:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Azure.Documents.Client" version="0.9.2-preview" targetFramework="portable-net451+win81+wpa81" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="portable-net451+win81+wpa81" />
</packages>
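If you'd rather not edit packages.config by hand, the same result should be achievable from the NuGet Package Manager Console in Visual Studio; this is a sketch assuming the package ids and versions shown above (the -Pre switch is required because 0.9.2-preview is a prerelease package):

Install-Package Microsoft.Azure.Documents.Client -Version 0.9.2-preview -Pre
Install-Package Newtonsoft.Json -Version 6.0.8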

Finally, after the rebuild, I have added a reference to Microsoft.Azure.Documents.Client by browsing the project folders and finding the newly downloaded DLL:


I have built the project again and it seems to be working, so let's try to connect to the database now. Based on an example provided by Microsoft for version 0.9.0, I have created a Connection class that goes like this:

using DocDbBridge.Models;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using System;
using System.Linq;
using System.Threading.Tasks;

namespace DocDbBridge
{
    public class Connection : IDisposable
    {
        string endPoint = "https://chan.documents.azure.com:443/";
        string authKey = "Weird string with a lot of meaningless characters";

        DocumentClient client { get; set; }
        Database database { get; set; }

        DocumentCollection collection { get; set; }

        public Connection()
        {
             client = new DocumentClient(new Uri(endPoint), authKey);
             database = ReadOrCreateDatabase("QuickStarts");
             collection = ReadOrCreateCollection(database.SelfLink, "Documents");
             CreateDocuments(collection.SelfLink);
        }


        public DbUser QueryDocumentsLinq(string UserEmail)
        {
            // The .NET SDK for DocumentDB supports 3 different ways of querying for documents:
            // LINQ queries, LINQ lambda expressions and SQL

            //LINQ lambda (fails in this preview, see below)
            //return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();
            return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();
        }

        public DbUser QueryDocumentsSQL(string SqlQuery)
        {
            //3. SQL, e.g.:
            //var query = client.CreateDocumentQuery<DbUser>(collection.SelfLink, "SELECT * " +
            //                                                                    "FROM UserDbs u " +
            //                                                                    "WHERE u.email='andrew@stratex.com'");

            var query = client.CreateDocumentQuery<DbUser>(collection.SelfLink, SqlQuery);

            return query.AsEnumerable().FirstOrDefault();
        }

        public void Dispose()
        {
            Cleanup(database.SelfLink);
            client.Dispose();
        }

        #region Private Methods
        private Database ReadOrCreateDatabase(string databaseId)
        {
            // Most times you won't need to create the Database in code, someone has likely created
            // the Database already in the Azure Management Portal, but you still need a reference to the
            // Database object so that you can work with it. Therefore this first query should return a record
            // the majority of the time

            Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

                //db = client.CreateDatabaseQuery()
                //                .Where(d => d.Id == databaseId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // In case there was no database matching, go ahead and create it. 
            if (db == null)
            {
                //Console.WriteLine("2. Database not found, creating");
                db = client.CreateDatabaseAsync(new Database { Id = databaseId }).Result;
            }

            return db;
        }

        private DocumentCollection ReadOrCreateCollection(string databaseLink, string collectionId)
        {
            DocumentCollection col = client.CreateDocumentCollectionQuery(databaseLink).ToList().Where(c => c.Id == collectionId).FirstOrDefault();

                //col = client.CreateDocumentCollectionQuery(databaseLink)
                //                .Where(c => c.Id == collectionId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // For this sample, if we found a DocumentCollection matching our criteria we are simply deleting the collection
            // and then recreating it. This is the easiest way to clear out existing documents that might be left over in a collection
            //
            // NOTE: This is not the expected behavior for a production application. 
            // You would likely do the same as with a Database previously. If found, then return, else create
            if (col != null)
            {
                //Console.WriteLine("3. Found DocumentCollection.\n3. Deleting DocumentCollection.");
                client.DeleteDocumentCollectionAsync(col.SelfLink).Wait();
            }

            //Console.WriteLine("3. Creating DocumentCollection");
            return client.CreateDocumentCollectionAsync(databaseLink, new DocumentCollection { Id = collectionId }).Result;
        }

        private void CreateDocuments(string collectionLink)
        {
            // DocumentDB provides many different ways of working with documents. 
            // 1. You can create an object that extends the Document base class
            // 2. You can use any POCO whether as it is without extending the Document base class
            // 3. You can use dynamic types
            // 4. You can even work with Streams directly.
            //
            // This sample method demonstrates only the first example
            // For more examples of other ways to work with documents please consult the samples on MSDN. 

            // Work with a well defined type that extends Document
            // In DocumentDB every Document must have an "id" property. If you supply one, it must be unique. 
            // If you do not supply one, DocumentDB will generate a unique value for you and add it to the Document. 
            var task1 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "chan@stratex.com",
                Name = "Test user",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 920", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
             });

            var task2 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "andrew@stratex.com",
                Name = "Andrew",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 925", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
            });

            
            // Wait for the above Async operations to finish executing
            Task.WaitAll(task1, task2);
        }

        private void Cleanup(string databaseLink)
        {
            client.DeleteDatabaseAsync(databaseLink).Wait();
        }
        #endregion
    }
}

As you might have noticed, you also need the URL and the auth key; you can get them from the Azure portal, which by now will probably have your database up and running:



After that I have added a reference to the DocDbBridge DLL in my app project:


After that I executed my app. By the way, this is the code behind the button... impressive:


        private void Button_Click(object sender, RoutedEventArgs e)
        {
            try
            {
                using (Connection conn = new Connection())
                {
                    var user = conn.QueryDocumentsLinq("andrew@stratex.com");

                    if (user != null)
                        Result.Text = user.Name;
                    else
                        Result.Text = "User not found.";
                }
            }
            catch (Exception ex)
            {
                Result.Text = "Exception found: " + ex.Message;
            }
        }

And it has failed miserably because it needs the Newtonsoft.Json package again... I have installed it in the W8.1 app project too.

After that I executed the project again, clicked my marvellous button and got an exception, Microsoft.Azure.Documents.BadRequestException: Syntax error, unexpected end-of-file, here:



client.CreateDatabaseQuery()
                                .Where(d => d.Id == databaseId)
                                .AsEnumerable()
                                .FirstOrDefault();

//And here:
client.CreateDocumentCollectionQuery(databaseLink)
                                .Where(c => c.Id == collectionId)
                                .AsEnumerable()
                                .FirstOrDefault();

It looks like something is not working quite right in this release... I have changed the first line to:

Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

This is a huge issue, because it means that you need to retrieve all the documents first and then filter them on the client instead of querying the database... completely unusable! :( Could it be because I am using a piece of software that is in preview, on a platform that is not supported?... Some would say yes...

The same issue appeared in the LINQ query; I had to change it from:

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();

To

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();

And I got the same error again when I tried to execute the queries with SQL.

Wrapping up:
Upsides: it kind of works, and that's really cool.
Downsides: what is a database you cannot query? I would say... not ideal.

But then again, this is preview software on an unsupported platform... and it works, more or less. Just imagine how cool the software will be in a couple of iterations!
