Showing posts with label Azure. Show all posts

22.9.15

How To Clone a Virtual Machine in Azure

This is quite simple once you know how to do it... not so much when you don't.

My favourite way of cloning virtual machines has always been copying the hard disks and deploying the virtual machine in a different network segment. We do have all the ingredients we need for this recipe in Azure so let's get to it!

You will need:

  • The blob that contains the vhd you want to clone
  • A new Cloud Service (or one where the source virtual machine is not running)
  • Notepad.exe (or similar, although nothing beats the original)
  • Azure PowerShell
  • Attention to detail
  • Just a bit of patience
Let's start.

The first thing you need to do is to find out the disk(s) you want to clone. In the virtual machine's dashboard page, if you scroll down a bit, you will be able to see them.

After you scroll down, do not forget to scroll right too; the URL we need is a bit hidden... copy it into your notepad.

This is the address of the blob that contains the disk, and it follows this pattern:

http://[yourStorageAccount].blob.core.windows.net/[yourContainerName]/[yourBlobName].vhd

Finally, you need the primary access key for your storage account.

Go to Storage, select your Storage Account and look on the footer of the page for the key icon:


There, copy your primary key and paste it into your notepad.

And that's all you need.

#From http://michaelwasham.com/windows-azure-powershell-reference-guide/copying-vhds-blobs-between-storage-accounts/
#And after that from http://www.codegrimoire.com
####################################################################################################################
### Source VHD - anonymous access container ###
$storageUri = "http://yourStorageAccount.blob.core.windows.net/"
$containerName = "yourContainerName" ##I am using the same source and destination here
$sourceBlobName = "yourBlobName.vhd"

### Destination Blob
$destBlobName = "newBlobName.vhd"

######### New Disk Name
$newDiskName =  "newDiskName"
$newDiskLabel = "BootDisk" ##In the documentation it is either BootDisk or DataDisk but you can call it something else
$isOsDisk = $true ##$true or $false. As I only work with Windows I will not bother about other OSs in this script

### Target Storage Account ###
$storageAccount = "yourStorageAccount"
$storageKey = "yourPrimaryKey" ##Primary Access Key


###################################################################################
############  Automated script for creating the new VHD  ##########################
###################################################################################

$srcUri = ($storageUri.trim('/'), $containerName.trim('/'), $sourceBlobName) -join '/'
$destUri = ($storageUri.trim('/'), $containerName.trim('/'), $destBlobName) -join '/'
 
### Create the destination context for authenticating the copy
$destContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
 
### Create the target container in storage
### Not necessary as I am using an existing one ### New-AzureStorageContainer -Name $containerName -Context $destContext 
 
### Start the Asynchronous Copy ###
$blob1 = Start-AzureStorageBlobCopy -srcUri $srcUri -DestContainer $containerName -DestBlob $destBlobName -DestContext $destContext

### Loop until complete ###                                    
Do{
  $status = $blob1 | Get-AzureStorageBlobCopyState 
  ### Print out status ###
  $status.Status
  Start-Sleep 5
}While($status.Status -eq "Pending") ##This polling doesn't always behave as you would expect; note that Get-AzureStorageBlobCopyState also has a -WaitForComplete switch if you prefer to block until the copy finishes


######## After the new blob has been created we will add the new disk #############
if ($isOsDisk){
    Add-AzureDisk -DiskName $newDiskName -MediaLocation $destUri -Label $newDiskLabel -OS "Windows"
}
else{
    Add-AzureDisk -DiskName $newDiskName -MediaLocation $destUri -Label $newDiskLabel
}

Once you edit the script with your data and run it, you will have to wait a couple of minutes before the new disk is available. You do not need to turn off the source virtual machine, but better safe than sorry.

After that go to Virtual Machines, New, Compute, Virtual Machine, From Gallery and choose My disks in the lower part of the left column:

The disk you have just created will appear in the list and after that you just need to create the new virtual machine normally.
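If you prefer to stay in PowerShell for this last step too, the classic service management cmdlets can create the VM straight from the disk registered with Add-AzureDisk. A minimal sketch (the VM name, instance size and cloud service name below are placeholders):

```powershell
### Create a VM directly from the cloned disk (classic ASM cmdlets, same as the rest of this post)
### $newDiskName is the disk registered with Add-AzureDisk in the script above
New-AzureVMConfig -Name "ClonedVM" -InstanceSize "Medium" -DiskName $newDiskName |
    New-AzureVM -ServiceName "yourNewCloudService"
```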

By the way, we need an Azure + SharePoint administrator in London, do you oblige?


Is it a good idea to call the cloned VMs Dolly?


15.1.15

Win RT Universal app with a DocumentDB

NoSql databases have been brought to my attention a couple of thousand times in the last few months. Given that I am not the tidiest of database designers, and that NoSql databases are supposed to be designed to scale, I have decided to give them a go.

My platform of choice is Microsoft, and it looks like we have all the tools we need: Windows universal apps that work the same on desktops and phones, and DocumentDB in Azure. Fantastic, let's begin.

Let's start with the database; as it takes a while to come up, that will give you time to work in Visual Studio in parallel.

As I already have my Azure account and everything set up, I went straight to creating the database account in the preview management portal.

The process is explained in detail here: http://azure.microsoft.com/en-us/documentation/articles/documentdb-create-account/ 

First you need to specify a couple of parameters. You won't necessarily see the page in Spanish; in fact, I don't know why it isn't showing it to me in English to set up the new database.



And once you are done you will be taken to the home page of the Azure portal while you wait...


And while we wait we can go to Visual Studio and start creating the projects we need. I usually say things like easy, piece of cake, etc., but surely not so often today.

First of all I have created a universal app in Visual Studio 2013; as I am so outstandingly good with user experience, it was the natural option for me to create two user interfaces, one for tablets and one for phones...


Now let's create a "Class Library (Portable for Universal Apps)" to manage the DocumentDB connections and queries:


Once the DLL project was created I added a folder called Models with a class DbUser and a class Device:
namespace DocDbBridge.Models
{
    public class DbUser
    {
        public string Email { get; set; }
        public string Name { get; set; }
        public Device[] Devices { get; set; }
    }
}


namespace DocDbBridge.Models
{
    public class Device // public: DbUser exposes Device[] through a public property, so Device cannot stay internal
    {
        public string Name { get; set; }
        public string Brand { get; set; }
    }
}

Finally I went to the main class (which I called Connection) and added the following usings:


using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;

It doesn't work because we are missing the Microsoft Azure DocumentDB Client Library 0.9.2-preview. In order to get it I have set the DLL to target .NET Framework 4.5.1.


Then got Newtonsoft.Json from Nuget:


That creates a packages.config file in the project. In it I have added a line for the Microsoft.Azure.Documents.Client and rebuilt the project. The packages.config file looks like this:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Azure.Documents.Client" version="0.9.2-preview" targetFramework="portable-net451+win81+wpa81" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="portable-net451+win81+wpa81" />
</packages>

Finally, after the rebuild I have added a reference to Microsoft.Azure.Documents.Client by browsing the project folders and finding the newly downloaded DLL:


I have built the project again and it seems to be working; let's try to connect to the database now. Based on an example provided by Microsoft for version 0.9.0, I have created a Connection class that goes like this:

using DocDbBridge.Models;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using System;
using System.Linq;
using System.Threading.Tasks;

namespace DocDbBridge
{
    public class Connection : IDisposable
    {
        string endPoint = "https://chan.documents.azure.com:443/";
        string authKey = "Weird string with a lot of meaningless characters";

        DocumentClient client { get; set; }
        Database database { get; set; }

        DocumentCollection collection { get; set; }

        public Connection()
        {
             client = new DocumentClient(new Uri(endPoint), authKey);
             database = ReadOrCreateDatabase("QuickStarts");
             collection = ReadOrCreateCollection(database.SelfLink, "Documents");
             CreateDocuments(collection.SelfLink);
        }


        public DbUser QueryDocumentsLinq(string UserEmail)
        {
            // The .NET SDK for DocumentDB supports 3 different methods of querying for documents:
            // LINQ queries, LINQ lambda expressions and SQL


            //LINQ lambda (the generic overload is needed so results deserialize as DbUser)
            //return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();
            return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();
        }

        public DbUser QueryDocumentsSQL(string SqlQuery)
        {
            //3. SQL
            //var query = client.CreateDocumentQuery<DbUser>(collection.SelfLink, "SELECT * " +
            //                                                                    "FROM UserDbs u " +
            //                                                                    "WHERE u.email='andrew@stratex.com'");

            var query = client.CreateDocumentQuery<DbUser>(collection.SelfLink, SqlQuery);

            return query.AsEnumerable().FirstOrDefault();
        }

        public void Dispose()
        {
            Cleanup(database.SelfLink);
            client.Dispose();
        }

        #region Private Methods
        private Database ReadOrCreateDatabase(string databaseId)
        {
            // Most times you won't need to create the Database in code, someone has likely created
            // the Database already in the Azure Management Portal, but you still need a reference to the
            // Database object so that you can work with it. Therefore this first query should return a record
            // the majority of the time

            Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

                //db = client.CreateDatabaseQuery()
                //                .Where(d => d.Id == databaseId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // In case there was no database matching, go ahead and create it. 
            if (db == null)
            {
                //Console.WriteLine("2. Database not found, creating");
                db = client.CreateDatabaseAsync(new Database { Id = databaseId }).Result;
            }

            return db;
        }

        private DocumentCollection ReadOrCreateCollection(string databaseLink, string collectionId)
        {
            DocumentCollection col = client.CreateDocumentCollectionQuery(databaseLink).ToList().Where(c => c.Id == collectionId).FirstOrDefault();

                //col = client.CreateDocumentCollectionQuery(databaseLink)
                //                .Where(c => c.Id == collectionId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // For this sample, if we found a DocumentCollection matching our criteria we are simply deleting the collection
            // and then recreating it. This is the easiest way to clear out existing documents that might be left over in a collection
            //
            // NOTE: This is not the expected behavior for a production application. 
            // You would likely do the same as with a Database previously. If found, then return, else create
            if (col != null)
            {
                //Console.WriteLine("3. Found DocumentCollection.\n3. Deleting DocumentCollection.");
                client.DeleteDocumentCollectionAsync(col.SelfLink).Wait();
            }

            //Console.WriteLine("3. Creating DocumentCollection");
            return client.CreateDocumentCollectionAsync(databaseLink, new DocumentCollection { Id = collectionId }).Result;
        }

        private void CreateDocuments(string collectionLink)
        {
            // DocumentDB provides many different ways of working with documents. 
            // 1. You can create an object that extends the Document base class
            // 2. You can use any POCO as it is, without extending the Document base class
            // 3. You can use dynamic types
            // 4. You can even work with Streams directly.
            //
            // This sample method demonstrates only the first example
            // For more examples of other ways to work with documents please consult the samples on MSDN. 

            // Work with a well defined type that extends Document
            // In DocumentDB every Document must have an "id" property. If you supply one, it must be unique. 
            // If you do not supply one, DocumentDB will generate a unique value for you and add it to the Document. 
            var task1 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "chan@stratex.com",
                Name = "Test user",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 920", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
             });

            var task2 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "andrew@stratex.com",
                Name = "Andrew",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 925", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
            });

            
            // Wait for the above Async operations to finish executing
            Task.WaitAll(task1, task2);
        }

        private void Cleanup(string databaseLink)
        {
            client.DeleteDatabaseAsync(databaseLink).Wait();
        }
        #endregion
    }
}

As you might have noticed you also need the URL and the auth key; you can get them from the Azure portal, which by now will probably have your database up and running:



After that I have added a reference to the DocDbBridge DLL in my app project:


After that, I executed my app. By the way, this is the code behind the button... impressive:


        private void Button_Click(object sender, RoutedEventArgs e)
        {
            try
            {
                using (Connection conn = new Connection())
                {
                    var user = conn.QueryDocumentsLinq("andrew@stratex.com");

                    if (user != null)
                        Result.Text = user.Name;
                    else
                        Result.Text = "User not found.";
                }
            }
            catch (Exception ex)
            {
                Result.Text = "Exception found: " + ex.Message;
            }
        }

And it has failed miserably because, again, it needs the Newtonsoft.Json package... so I have installed it on the W8.1 app project too.

After that I executed the project again, clicked my marvellous button and I got Exception: Microsoft.Azure.Documents.BadRequestException: Syntax error, unexpected end-of-file here:



client.CreateDatabaseQuery()
                                .Where(d => d.Id == databaseId)
                                .AsEnumerable()
                                .FirstOrDefault();

//And here:
client.CreateDocumentCollectionQuery(databaseLink)
                                .Where(c => c.Id == collectionId)
                                .AsEnumerable()
                                .FirstOrDefault();

It looks like there's something not working quite right in this release... I have changed the line to: 

Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

This is a huge issue because it means you need to retrieve all the documents first and then filter them on the client... Completely unusable! :( Could it be because I am using a piece of software that is in preview on a platform that is not supported?... some would say yes...

The LINQ query had the same issue, so I had to change from:

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();

To

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();

And I got the same error again when I tried to execute the queries with SQL.

Wrapping up:
Upsides: It kind of works and that's really cool.
Downsides: What is a database if you cannot query? I would say... Not Ideal. 

But then again, this is preview software on an unsupported platform... and it works, more or less. Just imagine how cool the software will be in a couple of iterations!


27.8.14

Cannot access my Azure VM (And how I solved it)

I was creating a new firewall rule in one of my Azure VMs and I went a tad greedy when it came to blocking IPs... I blocked ALL the traffic.

What to do?

  • My first thought was to download the VM, make the change locally and upload it again... but it's 250GB.
  • I tried changing the firewall rule through PowerShell from the Azure subscription, but Get-AzureWinRMUri was not working.
  • Finally, after asking a friend who asked a friend of his, I was told it was possible to edit the registry of a Windows OS disk from a different Windows machine. Wow! That was the answer!


And, in fact, once I knew it could be done I found a detailed post about how to do it.

So I deleted my beloved VM, attached the VHD to another VM and from the registry I deleted the firewall rule that was blocking all the traffic.
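For anyone wanting to reproduce it, the offline edit boils down to loading the SYSTEM hive of the attached VHD into the registry of the working VM. A rough sketch, assuming the attached disk got the letter E: and that you know the rule's registry value name (both are placeholders here):

```powershell
# Load the offline SYSTEM hive under a temporary key
reg load HKLM\OfflineSys E:\Windows\System32\config\SYSTEM

# Firewall rules live under this key; delete the offending value
reg delete "HKLM\OfflineSys\ControlSet001\Services\SharedAccess\Parameters\FirewallPolicy\FirewallRules" /v "YourBlockAllRule" /f

# Unload the hive before detaching the VHD
reg unload HKLM\OfflineSys
```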

Then I detached the VHD, created a new VM with it and done! We had the VM up and running and accessible.


29.7.14

Azure VM Snapshots Even Simpler

I have been thinking about taking and restoring snapshots of my Azure Virtual Machines for a while but never had the time to do it... until today.

It is extremely easy when you know how to do it, but it can be a bit daunting when you have to start from scratch. As I had to write a post explaining it, and it was about time for me to write another post on my blog, I thought it would be a good idea to do it here.

From the beginning:
Luckily by now you will have it working.

Now the real thing. Chris Clayton has created a set of scripts that work wonderfully for managing Azure VM snapshots, and that is what I am using right now. If you are too lazy to go to the page and find the link, you can get the scripts from here.
The only thing you need to configure is the Subscriptions.csv file. It's a CSV file so in the first line you have the column names and in the rest of them you have the data. It looks like this:
SubscriptionName,SubscriptionId,CertificateThumbprint
"IT","XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX","XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
This file will make executing the commands easier, as it has all the information about the subscription(s) that you are going to use. Yes, cool, but... where do I get the data? Look.

First, to get all the information we need about our subscription in PowerShell we can run:
Get-AzureSubscription
That will give us a screen like this one:
 Those are the parameters you need to fill in. Simple stuff.

Finally, in order to run the commands you will have to provide data about which subscription to use and which virtual machine inside it. Simple stuff: we get the list of virtual machines in our subscription by executing Get-AzureVM, and we will get a screen like this one:


Now we have all the parameters we need. Just for testing it we will launch the script for getting the list of snapshots in a virtual machine.


./GetSnapshotList.ps1 -subscriptionName "NameOfTheSubscription" -cloudServiceName "ServiceName" -virtualMachineName "Name" -maximumDays 15
Where:
  • -subscriptionName is the SubscriptionName you got from Get-AzureSubscription and then wrote in Subscriptions.csv
  • -cloudServiceName is the ServiceName column you can see at Get-AzureVM
  • -virtualMachineName is the Name column you can see at Get-AzureVM

And that's it.

Things to remember:
  • The VM should be shut down before taking the snapshot (there's a handy parameter for that in the PowerShell command)
  • The data in your "Temporary Storage" drives will not be backed up or restored because they are temporary. Yes I knew it was going to work that way but I had to try.

As a test I took a snapshot of one of my SharePoint development machines, made some changes, restored it and saw how the changes were reverted, just as expected. And I would even say faster than I expected.


9.8.13

The process wlms.exe has initiated the power off of computer

I was getting random shutdowns in one of my virtual machines hosted in Azure... I thought it was something related to Azure, but no.

I had a look at the shutdown log and found this:

Log Name:      System
Source:        USER32
Date:          08/08/2013 17:39:04
Event ID:      1074
Task Category: None
Level:         Information
Keywords:      Classic
User:          SYSTEM
Computer:      COMPUTERNAME.local
Description:
The process wlms.exe has initiated the power off of computer COMPUTERNAME on behalf of user NT AUTHORITY\SYSTEM for the following reason: Other (Unplanned)
 Reason Code: 0x0
 Shutdown Type: power off
 Comment: 
Event Xml:
<Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
  <System>
    <Provider Name="USER32" />
    <EventID Qualifiers="32768">1074</EventID>
    <Level>4</Level>
    <Task>0</Task>
    <Keywords>0x80000000000000</Keywords>
    <TimeCreated SystemTime="2013-08-08T16:39:04.000000000Z" />
    <EventRecordID>147067</EventRecordID>
    <Channel>System</Channel>
    <Computer>COMPUTERNAME.local</Computer>
    <Security UserID="S-1-5-18" />
  </System>
  <EventData>
    <Data>wlms.exe</Data>
    <Data>COMPUTERNAME</Data>
    <Data>Other (Unplanned)</Data>
    <Data>0x0</Data>
    <Data>power off</Data>
    <Data>
    </Data>
    <Data>NT AUTHORITY\SYSTEM</Data>
  </EventData>
</Event>

A quick googling told me that wlms.exe belongs to the Windows Licensing Monitoring Service: in order to stop the VM from shutting down I needed to activate Windows :)
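If you want to do the activation from an elevated prompt instead of the UI, slmgr should do it (the product key below is obviously a placeholder):

```powershell
# Install the product key, then attempt online activation
slmgr /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
slmgr /ato
```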


17.4.13

Windows Azure IaaS Virtual Machines prices are competitive, but not cheap

As you probably know, Microsoft has made the new Windows Azure Virtual Machines and Virtual Network available to everyone. You can see more detail on the matter here.

Everything is fine but I still don't like the pricing...

The screenshots look great (I haven't had time to test it lately) and they have included more templates, including one trial VM for SharePoint 2013 that will expire in October this year.

I put the trial in bold because, really, who would pay $270 a month for testing on a VM that will probably self-destruct in six months?

The greatness of IaaS is that anyone can create anything without having to spend a lot of money on hardware, I see that. But on the other hand, if the pricing of the platform is so close to what traditional hosting already offers... what's the point?

A fast search will show you that you can get a 4-core machine with 2GB of RAM and a 500GB HDD for £79 a month, so why pay £85 for a similar machine hosted in the cloud?

The main upside is that in Azure you can create VMs and delete them in a matter of minutes, while with a traditional vendor you have to phone them, and normally tell them a month in advance that you want to terminate the service.

If you want to create a farm and test how to connect everything, you can do it, configure it, play around a bit and then delete everything for something like $1 an hour; that's good. But if you want something more stable, the differences between Azure and traditional hosting are not so big.

Needless to say the technology is amazing: built-in load balancer, easy creation of virtual networks, new templates and VMs with 56GB of RAM.

And needless to say I am very excited about it and can't wait to put my hands on it.

But I still see it a bit pricey.


25.2.12

CrossDomain SecurityException accessing my Azure WCF Service from Silverlight

Just uploaded a WCF Service Web Role to Azure with the intention of accessing it from a Silverlight web part, but I got the dreaded and almost expected:
System.ServiceModel.CommunicationException: An error occurred while trying to make a request to URI 'http://localhost:81/MyService.svc'. This could be due to attempting to access a service in a cross-domain way without a proper cross-domain policy in place, or a policy that is unsuitable for SOAP services. You may need to contact the owner of the service to publish a cross-domain policy file and to ensure it allows SOAP-related HTTP headers to be sent. This error may also be caused by using internal types in the web service proxy without using the InternalsVisibleToAttribute attribute. Please see the inner exception for more details. ---> System.Security.SecurityException ---> System.Security.SecurityException: Security error.
Yes, cool, but how do I add my crossdomain.xml or my clientaccesspolicy.xml files to an “Azure Web Role” which I just heard exists?

To my surprise it’s very easy you just have to add the XMLs to the project:

adding crossdomain.xml and clientaccesspolicy.xml to azure web role
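For reference, a permissive clientaccesspolicy.xml looks something like this (it allows calls from any domain and admits the SOAPAction header Silverlight sends for SOAP services; tighten the domain list for production):

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="SOAPAction">
        <domain uri="*" />
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true" />
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
```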

Then you’ll be able to debug and, after you “Upgrade” the package in Azure Hosted service, you’ll be able to access it from Silverlight.

By the way, I don’t like "Upgrade" here, I find it misleading. I would have used Update.


1.2.11

Neither SharePoint nor SQL Server is working on the Azure VM Role

As you know, if you read this blog, I have been trying to get accepted into the Azure VM Role beta program. Well, I have finally got a meeting with the guys that could grant me access and they've asked me the most feared question... What do you want it for?

My idea was to set up a farm in the cloud with SQL and SharePoint so I could use SSIS, our BI cube, Reporting services and SharePoint.

Well, these Microsoft guys say that "at the moment that's impossible"... Looks like there are connectivity problems between the machines in the VM Role, so neither SharePoint nor SQL works properly.

The advice they have for me is to set everything up on Office 365. But what happens to my BI cube and my reporting services if I can't use web services from Office 365's sandbox? Hmmm, they won't work, so I'll just have half of my solution working and the other half sadly forgotten in the recesses of my hard disk.

The solution they provide... wait until summer.

I'll try to convince my boss to give me six months off, but if this doesn't work I am going to be forced to look for another place to set up my farm on.

Oh and, in case someone doubts it, I wasn't granted the access to the beta program.

And in the meantime Google is giving away laptops for the developers to test their new Chrome OS...


28.1.11

Uploading Machines to Azure VM Role

As all the technology we use at Stratex Systems is from Microsoft, it seems only logical to host our virtual machines in the Azure cloud... Beta! Now I'm scared. There's not much information available about this at the moment, so I'll try to write about my experience.

The first thing I want to tell anyone who is trying to test the VM Role is that you need to be accepted into the beta program. There are a lot of web pages, including several from Microsoft, that say you'll only need to sign up and that's enough to start working, but it's not true.

If you don't know what this is all about: Azure VM Role is a new system to sell IaaS (Infrastructure as a Service) where you upload your Hyper-V hard disks, after installing all the software you need, and they bring them to life. After that, they charge you (per hour) for the storage, number of processors and the amount of RAM you use.

Well, to tell the truth, I can't talk much more about this topic because I'm still waiting to be accepted into the beta program...



Now I feel like a 15 year old boy, a lot of theory, but very little practice.
