Showing posts with label C#. Show all posts

28.5.20

Synchronous and Asynchronous Thread-Safe Blitzkrieg Caching

Update! You can now download BlitzCache from NuGet.


Caching is necessary
Over the years I have used a lot of caching. In fact, I consider that some values, like user permissions, should normally be cached for at least one minute.

Blitzkrieg Caching
Even when a method is cached, there are cases when it is called again before the first call has finished, and this results in a new request to the database, this time much slower. This is what I call The Blitzkrieg Scenario.

The slower the query, the more likely this is to happen and the worse the impact. Too many times I have seen SQL Server freeze struggling to answer the exact same query while that same query was already being executed...

Ideally, at least in my mind, a cached method should only calculate its value once per cache period. To achieve this we could use a lock... but if I am caching different calls I want more than one of them to execute at the same time: exactly one at a time per cache key, in parallel across keys. This is why I created the LockDictionary class.

The LockDictionary
Instead of having a single lock in my cache service that would block all parallel calls indiscriminately, I have a dictionary of locks so I can lock by cache key.

public static class LockDictionary
{
    private static readonly object dictionaryLock = new object();
    private static readonly Dictionary<string, object> locks = new Dictionary<string, object>();

    public static object Get(string key)
    {
        // Dictionary<TKey, TValue> is not safe for concurrent reads and writes,
        // so both the check and the lookup happen inside the lock.
        lock (dictionaryLock)
        {
            if (!locks.ContainsKey(key)) locks.Add(key, new object());
            return locks[key];
        }
    }
}
With this I can very easily select exactly what I want to lock.
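As an aside, on .NET 4 and later a ConcurrentDictionary can replace the lock-guarded dictionary entirely. This is just an alternative sketch, not what BlitzCache ships; the class name is mine:

```csharp
using System;
using System.Collections.Concurrent;

public static class ConcurrentLockDictionary
{
    private static readonly ConcurrentDictionary<string, object> locks =
        new ConcurrentDictionary<string, object>();

    // GetOrAdd is atomic per key: the factory may race, but only one
    // object wins and that same object is returned to every caller.
    public static object Get(string key) => locks.GetOrAdd(key, _ => new object());

    public static void Main()
    {
        Console.WriteLine(ReferenceEquals(Get("a"), Get("a"))); // True
        Console.WriteLine(ReferenceEquals(Get("a"), Get("b"))); // False
    }
}
```

The trade-off is the same as with the original: lock objects accumulate per key and are never evicted, which is fine for a bounded set of cache keys.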

GetBlitzkriegLocking
Now I can check if something is cached and return it, or lock that particular call and calculate the value of the function passed as a parameter.

public class CacheService : ICacheService
{
  private readonly IMemoryCache memoryCache;

  public CacheService(IMemoryCache memoryCache)
  {
      this.memoryCache = memoryCache;
  }

  public T GetBlitzkriegLocking<T>(string cacheKey, Func<T> function, double milliseconds)
  {
      if (memoryCache.TryGetValue(cacheKey, out T result)) return result;
      lock (LockDictionary.Get(cacheKey))
      {
          if (memoryCache.TryGetValue(cacheKey, out result)) return result;

          result = function.Invoke();
          memoryCache.Set(cacheKey, result, DateTime.Now.AddMilliseconds(milliseconds));
      }

      return result;
  }
}
And how do I use it?
var completionInfo = cacheService.GetBlitzkriegLocking($"CompletionInfo-{legalEntityDto.Id}", () => GetCompletionInfoDictionary(legalEntityDto), 500);
//Look ma, I am caching this for just 500 milliseconds and it really makes a difference
I find this method extremely useful, but sometimes the function I am calling needs to be awaited... and you cannot await inside the body of a lock statement. What do I do?

The SemaphoreDictionary
Semaphores do allow you to await whatever you need; in fact, they themselves are awaitable. If we translate the LockDictionary class to use semaphores it looks like this:

public static class SemaphoreDictionary
{
    private static readonly object dictionaryLock = new object();
    private static readonly Dictionary<string, SemaphoreSlim> locks = new Dictionary<string, SemaphoreSlim>();

    public static SemaphoreSlim Get(string key)
    {
        // Same rule as LockDictionary: do both the check and the lookup inside the lock.
        lock (dictionaryLock)
        {
            if (!locks.ContainsKey(key)) locks.Add(key, new SemaphoreSlim(1, 1));
            return locks[key];
        }
    }
}
And using this I can await calls while I am locking stuff.


The Awaitable GetBlitzkriegLocking

The main rule about semaphores is that you must make sure you release them, or they will stay locked forever; that is why the Release goes in a finally block. Catching the exception is optional though.
public async Task<T> GetBlitzkriegLocking<T>(string cacheKey, Func<Task<T>> function, double milliseconds)
{
    if (memoryCache.TryGetValue(cacheKey, out T result)) return result;

    var semaphore = SemaphoreDictionary.Get(cacheKey);

    await semaphore.WaitAsync(); // acquire before the try: if WaitAsync throws, the finally must not release
    try
    {
        if (!memoryCache.TryGetValue(cacheKey, out result))
        {
            result = await function.Invoke();
            memoryCache.Set(cacheKey, result, DateTime.Now.AddMilliseconds(milliseconds));
        }
    }
    finally
    {
        semaphore.Release();
    }

    return result;
}
And how do I use it?
await cache.GetBlitzkriegLocking($"RequestPermissions-{userSid}-{workplace}", () => RequestPermissionsAsync(userSid, workplace), 60 * 1000);
Please give these methods a try and let me know how you would improve them. I am using them every now and then and I really enjoy how they simplify the code. I hope you like them too.


11.6.15

Migrating SharePoint Users to a New Domain

I have been dreading this type of migration for years... so many years that I already had planned a couple of ways of solving the issue. It finally happened.

Scenario:

Someone decides we need to move the farm to a completely different environment, with a new AD, in a new city.

Well, let's get to it. We have created a new SharePoint farm in the new environment and we have backed up and restored the content databases. We have manually changed the admin of the site collection to the new admin in the new AD in the new farm and we can access the site and see the data. Fantastic.

Fantastic?

The users in the list items are the users from the old farm. We have several lists with a lot of user fields, and some of them have tens or hundreds of thousands of rows. Changing them manually is not an option.

First idea: Go refined and try stsadm -o migrateuser:

Ohh, so easy... we change the login name of the user to something else and we are good, because the user IDs stay the same... NO.

This is a new domain and we don't have access to the old domain's users, so the migrateuser operation throws a nice User not found error.

Second idea: Go berserk and change the strings in the list items

And that worked. Oh, the beauty of a simple idea. The process is pure brute force... beautiful in its barbarity. If you have read up to here you are probably desperate for a solution.

Step One:
Get all of the users from the old farm into an XML file or something equally high tech (a CSV would work too).

static void Main(string[] args)
{
    using (SPSite site = new SPSite(args[0]))
    {
        using (SPWeb web = site.OpenWeb())
        {
            XElement users = new XElement("Users");

            foreach (SPUser user in web.SiteUsers)
            {
                XElement xmlUser = new XElement("User");
                xmlUser.Add(new XAttribute("Name", user.Name));
                xmlUser.Add(new XAttribute("LoginName", user.LoginName));
                xmlUser.Add(new XAttribute("ID", user.ID));

                users.Add(xmlUser);
            }

            users.Save("SiteUsers.xml");
        }
    }

    Console.WriteLine("\nProcess finished...");
    Console.ReadLine();
}


Step Two:
Make sure all the users you need exist in the new AD. As you now have the list in XML, you can pass it to someone with privileges in the AD.

Step Three:
Ensure the users in SharePoint, add them to a group with reading permissions and then iterate through all the items in the list changing the users from the old domain to the users in the new one.

static void Main(string[] args)
{
    XElement users = XElement.Load("SiteUsers.xml");
    string newDomain = "XXXXXXXX";

    string ListName = string.Empty;
    if (args.Length == 2) ListName = "Stratex Framework";
    else ListName = args[2];

    //Args are SiteUrl VisitorsGroup ListName
    using (SPSite site = new SPSite(args[0]))
    {
        using (SPWeb web = site.OpenWeb())
        {
            SPList listToUpdate = web.Lists[ListName];

            Dictionary<string, SPUser> NewUsers = new Dictionary<string, SPUser>();

            foreach (XElement user in users.Descendants("User"))
            {
                string LoginName = FixDomain(user.Attribute("LoginName").Value, newDomain);
                SPUser spUser = null;
                try
                {
                    //We try to ensure all the users from the XML file. We'll probably need them
                    spUser = web.EnsureUser(LoginName);
                }
                catch
                { Logger.WriteLine("The user {0} could not be found.", LoginName); }

                if (spUser != null)
                {
                    SPGroup viewers = web.Groups[args[1]];

                    viewers.AddUser(spUser);
                    //Finally we add them to a group with read permissions
                    //We can worry about restricting this further after the migration

                    //Keyed by the user's ID in the old farm, which is what the list items store
                    if (!NewUsers.ContainsKey(user.Attribute("ID").Value)) NewUsers.Add(user.Attribute("ID").Value, spUser);
                }
            }

            web.Update();


            UpdateUsersInList(listToUpdate, NewUsers);
        }

    }

    Logger.WriteLine("\nProcess finished...");
    Console.ReadLine();
}

private static void UpdateUsersInList(SPList list, Dictionary<string, SPUser> NewUsers)
{
    int itemsInList = list.ItemCount;
    Logger.WriteLine("Updating users at {0}. {1} items.", list.Title, itemsInList.ToString());

    SPQuery qry = new SPQuery();
    qry.ViewAttributes = "Scope=\"RecursiveAll\"";
    SPListItemCollection allItems = list.GetItems(qry);
    int count = 0;

    UpdateCount(count++, itemsInList);

    foreach (SPListItem item in allItems)
    {
        try
        {
            bool changed = false;
            SPFieldCollection allFields;
            //If the item has a content type it usually has fewer fields
            if (item.ContentType == null)
                allFields = item.Fields;
            else
                allFields = item.ContentType.Fields;

            foreach (SPField field in allFields)
            {
                if (field is SPFieldUser)
                    changed = ChangeUserToNewDomain(item, field, NewUsers) || changed;
            }

            changed = ChangeUserToNewDomain(item, item.Fields.GetFieldByInternalName("Author"), NewUsers) || changed;
            changed = ChangeUserToNewDomain(item, item.Fields.GetFieldByInternalName("Editor"), NewUsers) || changed;

            if (changed) item.SystemUpdate(false); //if the item has not been changed we won't update it to save time
        }
        catch (Exception ex) { Logger.WriteLine("Failed to update item {0}. Exception {1}", item.Title, ex.Message); }

        UpdateCount(count++, itemsInList);
    }

    UpdateCount(count++, 0);
}

private static bool ChangeUserToNewDomain(SPListItem item, SPField field, Dictionary<string, SPUser> NewUsers)
{
    bool changed = false;
    string fieldContent = item[field.InternalName] == null ? null : item[field.InternalName].ToString();

    if (string.IsNullOrEmpty(fieldContent)) return false;

    List<string> oldUserIds = GetUserIDs(fieldContent.Split(new string[] { ";#" }, StringSplitOptions.RemoveEmptyEntries));

    if (oldUserIds.Count == 1)
    {   //The field has only one user in it
        SPFieldUserValue foundUser = FindUser(NewUsers, oldUserIds[0]);

        if (foundUser != null)
        {
            item[field.InternalName] = foundUser;
            changed = true;
        }
    }
    else if (oldUserIds.Count > 1)
    {   //The field has several users in it
        SPFieldUserValueCollection usersInField = new SPFieldUserValueCollection();
        foreach (string oldUser in oldUserIds)
        {
            SPFieldUserValue foundUser = FindUser(NewUsers, oldUser);

            if (foundUser != null)
                usersInField.Add(foundUser);
        }

        if (usersInField.Count > 0)
        {
            item[field.InternalName] = usersInField;
            changed = true;
        }
    }
            
            
    return changed;
}

private static List<string> GetUserIDs(string[] UserTokens)
{   //We do not care about the login name. The ID is gold
    List<string> result = new List<string>();

    if (UserTokens.Length > 0)
    {
        for (int i = 0; i < UserTokens.Length; i++)
        {
            int id;

            if (i % 2 == 0 && int.TryParse(UserTokens[i], out id))
                result.Add(id.ToString());
        }
    }

    return result;
}

private static SPFieldUserValue FindUser(Dictionary<string, SPUser> NewUsers, string oldUser)
{
    SPUser foundUser = null;

    if (NewUsers.ContainsKey(oldUser)) foundUser = NewUsers[oldUser];
    else
    {
        //If we can't find the ID of the user we will still try with the login or even with the Display Name
        foreach (SPUser newUser in NewUsers.Values)
        {
            if (newUser.Name == oldUser || newUser.LoginName == oldUser) { foundUser = newUser; break; }
        }
    }

    if (foundUser != null)
        return new SPFieldUserValue(foundUser.ParentWeb, foundUser.ID, foundUser.Name);
    else
        return null;
}

private static string FixDomain(string loginName, string newDomain)
{
    //Here we change the users from XXXXX\\User to YYYYY\\User
    //The source domain was claim based
    if (loginName.Contains("|")) loginName = loginName.Split('|')[1];

    string[] tokens = loginName.Split('\\');

    tokens[0] = newDomain;

    return string.Join("\\", tokens);
}

private static void UpdateCount(int currentItem, int itemsInList)
{
    int percentage;

    if (currentItem == 0) percentage = 0;
    else if (itemsInList == 0) percentage = 100;
    else
    {
        //We will only change the value every 10 times to make the process faster.
        if (currentItem % 10 != 0) return;
        percentage = currentItem * 100 / itemsInList;
    }
    Console.Write("\r");
    if (percentage >= 0 && percentage < 10)
        Console.Write("  ");
    else if (percentage >= 10 && percentage < 100)
        Console.Write(" ");

    Console.Write("{0}%", percentage);
}

This is a first prototype that has worked as expected, but it is far from fully tested. If you need it, you can use it as a base to develop your own tool.

The one who possesses the strings has the power.


4.6.15

Geeky T-Shirt For Free... Count me on Xamarin!

I firmly believe that once the majority of the desktop computers of the world are capable of running Microsoft Store applications (Windows 10 will be released on July 29), the natural way of developing anything will be through the Microsoft Store. I usually get excited whenever Microsoft says anything, excited or angry, but mostly excited because I am quite a naïve guy.

But even so, in the same way some Android and iOS developers port their applications to the Microsoft Store, we might need to port our applications to other platforms. The other markets will be ridiculously small, but still.


I have been waiting for the right opportunity to give Xamarin a go, and now that we are exploring new platforms for a new tablet project (and there is a free t-shirt on offer), it seems like the right moment.

The installation process on my Surface Pro 3 has not been easy. I struggled to make it run in VS2015 and started using Xamarin Studio instead. I had one small issue with XS, nothing that 30 seconds in the Xamarin forums could not fix. All looked good, but then when I tried starting the application in the virtual devices I couldn't. Using real hardware was not an option because I only have Windows Phones at home, 6.1 to 10.

I found the solution by getting and installing the Xamarin Player, which also installed VirtualBox; after that, I downloaded and configured a Nexus 4 VM.

In the last step, after you select which t-shirt you want, you need to go to the code and make one small change. Even though the change was small, it made me feel more inclined to hack through the code and modify it to do something else.



I can't wait to get the t-shirt and be one of the LINQ... do the process yourself if you want to know what I mean.

Oh and the app has a lot of code that you will probably reuse too... another gift!




11.2.15

No more recursive functions to define CAML Queries thanks to Camlex

Sometimes you have a variable number of conditions to check in a CAML query, and in those cases I used to define the queries with an ad-hoc recursive function that I usually have to debug a few times before it works perfectly.

The code for those queries would be something like this (and this is a simple one):

public List<string> GetSomeInfo(string fieldsToSearch, string contentTypesToSearch)
{
    ...

    var queryval = string.Empty;
    if (contentTypesToSearch.IsNullOrEmpty())
        queryval = string.Format("<Where>" + GenerateFieldsQuery(fieldsToSearch.Split(','), 0) + "</Where>", text);
    else
        queryval = string.Format("<Where><And>" + GenerateCTypesQuery(contentTypesToSearch.Split(','), 0) + GenerateFieldsQuery(fieldsToSearch.Split(','), 0) + "</And></Where>", text);

    var scope = "Scope=\"RecursiveAll\"";

    ...
}

private static string GenerateFieldsQuery(string[] fields, int index)
{
    if (fields.Length == 0) return string.Empty;

    if (fields.Length == index + 1)
        return "<Contains><FieldRef Name='" + fields[index] + "' /><Value Type='Text'>{0}</Value></Contains>";

    return "<Or><Contains><FieldRef Name='" + fields[index] + "' /><Value Type='Text'>{0}</Value></Contains>" + GenerateFieldsQuery(fields, ++index) + "</Or>";
}

private static string GenerateCTypesQuery(string[] cTypes, int index)
{
    if (cTypes.Length == 0) return string.Empty;

    if (cTypes.Length == index + 1)
        return "<Eq><FieldRef Name='ContentType' /><Value Type='Choice'>" + cTypes[index] + "</Value></Eq>";

    return "<Or><Eq><FieldRef Name='ContentType' /><Value Type='Choice'>" + cTypes[index] + "</Value></Eq>" + GenerateCTypesQuery(cTypes, ++index) + "</Or>";
}

That was until now... Thanks to Camlex (and thanks to Luis for showing it to me), that code can be written like this:

public List<string> GetSomeInfo(string fieldsToSearch, string contentTypesToSearch)
{
    ...

    var queryVal = string.Empty;
    var fieldExtensions = new List<Expression<Func<SPListItem, bool>>>();
    var cTypeExtensions = new List<Expression<Func<SPListItem, bool>>>();

    if (!contentTypesToSearch.IsNullOrEmpty())
    {
        foreach (var cType in contentTypesToSearch.Split(','))
            cTypeExtensions.Add(x => (string)x["ContentType"] == cType);
    }

    foreach (var field in fieldsToSearch.Split(','))
        fieldExtensions.Add(x => ((string)x[field]).Contains(text));

    var expressions = new List<Expression<Func<SPListItem, bool>>>();
    expressions.Add(ExpressionsHelper.CombineOr(cTypeExtensions));
    expressions.Add(ExpressionsHelper.CombineOr(fieldExtensions));

    queryVal = Camlex.Query().WhereAll(expressions).ToString();

    ...
}
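ExpressionsHelper.CombineOr comes with Camlex; conceptually it OR-chains a list of predicate expressions into one. Just to illustrate that general expression-combining trick outside of SharePoint (this is not Camlex's actual implementation, and the class name is mine), a standalone sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Linq.Expressions;

public static class ExpressionOrDemo
{
    // OR-chain a list of predicate expressions into a single expression
    // by invoking each one against a shared parameter.
    public static Expression<Func<T, bool>> CombineOr<T>(IList<Expression<Func<T, bool>>> parts)
    {
        var param = Expression.Parameter(typeof(T), "x");
        Expression body = Expression.Constant(false);
        foreach (var part in parts)
            body = Expression.OrElse(body, Expression.Invoke(part, param));
        return Expression.Lambda<Func<T, bool>>(body, param);
    }

    public static void Main()
    {
        var parts = new List<Expression<Func<string, bool>>>
        {
            s => s.Contains("foo"),
            s => s.Contains("bar"),
        };
        var combined = CombineOr(parts).Compile();
        Console.WriteLine(combined("foobar")); // True
        Console.WriteLine(combined("baz"));    // False
    }
}
```

Camlex then walks expression trees like these and translates them into the `<Or>`/`<Contains>` CAML that the recursive functions above built by hand.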

I'll miss the recursive methods though... they made me feel special...


15.1.15

Win RT Universal app with a DocumentDB

NoSQL databases have been brought to my attention a couple of thousand times in the last months, and given that I am not the tidiest of database designers, and also that NoSQL databases are supposed to be designed to scale, I have decided to give them a go.

My platform of choice is Microsoft, and it looks like we have all the tools we need: Windows apps that work the same on desktops and phones, and DocumentDB in Azure. Fantastic, let's begin.

Let's start with the database; as it takes a while to spin up, that will give you time to work in Visual Studio in the meantime.

As I already have my Azure account and everything set up, I went straight to the creation of the database account in the preview management portal.

The process is explained in detail here: http://azure.microsoft.com/en-us/documentation/articles/documentdb-create-account/ 

First you need to specify a couple of parameters to set up the new database (you won't necessarily see the page in Spanish; in fact, I don't know why it is not showing it to me in English).



And once you are done, you will be taken to the home page of the Azure portal while you wait...


And while we wait we can go to Visual Studio and start creating the projects we need. I usually say things like "easy", "piece of cake", etc., but surely not so often today.

First of all I have created a universal app in Visual Studio 2013. As I am so outstandingly good with user experience, it will be the natural option for me to create two user interfaces, one for tablets and one for phones...


Now let's create a "Class Library (Portable for Universal Apps)" to manage the DocumentDB connections and queries:


Once the DLL project was created I added a folder called Models with a class DbUser and a class Device:
namespace DocDbBridge.Models
{
    public class DbUser
    {
        public string Email { get; set; }
        public string Name { get; set; }
        public Device[] Devices { get; set; }
    }
}


namespace DocDbBridge.Models
{
    public class Device // must be public: DbUser exposes a public Device[] property
    {
        public string Name { get; set; }
        public string Brand { get; set; }
    }
}

Finally I went to the main class (which I called Connection) and added the following usings:


using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;

It doesn't work because we are missing the Microsoft Azure DocumentDB Client Library 0.9.2-preview. In order to get it, I have set the DLL to target .NET Framework 4.5.1.


Then I got Newtonsoft.Json from NuGet:


That creates a packages.config file in the project. In it I have added a line for Microsoft.Azure.Documents.Client and rebuilt the project. The packages.config file looks like this:

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Azure.Documents.Client" version="0.9.2-preview" targetFramework="portable-net451+win81+wpa81" />
  <package id="Newtonsoft.Json" version="6.0.8" targetFramework="portable-net451+win81+wpa81" />
</packages>

Finally, after the rebuild, I have added a reference to Microsoft.Azure.Documents.Client by browsing the project folders and finding the newly downloaded DLL:


I have built the project again and it seems to be working; let's try to connect to the database now. Based on an example provided by Microsoft for version 0.9.0, I have created a Connection class that goes like this:

using DocDbBridge.Models;
using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using Microsoft.Azure.Documents.Linq;
using System;
using System.Linq;
using System.Threading.Tasks;

namespace DocDbBridge
{
    public class Connection : IDisposable
    {
        string endPoint = "https://chan.documents.azure.com:443/";
        string authKey = "Weird string with a lot of meaningless characters";

        DocumentClient client { get; set; }
        Database database { get; set; }

        DocumentCollection collection { get; set; }

        public Connection()
        {
             client = new DocumentClient(new Uri(endPoint), authKey);
             database = ReadOrCreateDatabase("QuickStarts");
             collection = ReadOrCreateCollection(database.SelfLink, "Documents");
             CreateDocuments(collection.SelfLink);
        }


        public DbUser QueryDocumentsLinq(string UserEmail)
        {
            // The .NET SDK for DocumentDB supports 3 different methods of querying for documents:
            // LINQ queries, LINQ lambda and SQL


            //LINQ Lambda
            //return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();
            return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();
        }

        public DbUser QueryDocumentsSQL(string SqlQuery)
        {
            //3. SQL
            //var query = client.CreateDocumentQuery(collection.SelfLink, "SELECT * " +
            //                                                           "FROM UserDbs u " +
            //                                                           "WHERE u.email='andrew@stratex.com'");

            var query = client.CreateDocumentQuery<DbUser>(collection.SelfLink, SqlQuery);

            return query.AsEnumerable().FirstOrDefault();
        }

        public void Dispose()
        {
            Cleanup(database.SelfLink);
            client.Dispose();
        }

        #region Private Methods
        private Database ReadOrCreateDatabase(string databaseId)
        {
            // Most times you won't need to create the Database in code, someone has likely created
            // the Database already in the Azure Management Portal, but you still need a reference to the
            // Database object so that you can work with it. Therefore this first query should return a record
            // the majority of the time

            Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

                //db = client.CreateDatabaseQuery()
                //                .Where(d => d.Id == databaseId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // In case there was no database matching, go ahead and create it. 
            if (db == null)
            {
                //Console.WriteLine("2. Database not found, creating");
                db = client.CreateDatabaseAsync(new Database { Id = databaseId }).Result;
            }

            return db;
        }

        private DocumentCollection ReadOrCreateCollection(string databaseLink, string collectionId)
        {
            DocumentCollection col = client.CreateDocumentCollectionQuery(databaseLink).ToList().Where(c => c.Id == collectionId).FirstOrDefault();

                //col = client.CreateDocumentCollectionQuery(databaseLink)
                //                .Where(c => c.Id == collectionId)
                //                .AsEnumerable()
                //                .FirstOrDefault();

            // For this sample, if we found a DocumentCollection matching our criteria we are simply deleting the collection
            // and then recreating it. This is the easiest way to clear out existing documents that might be left over in a collection
            //
            // NOTE: This is not the expected behavior for a production application. 
            // You would likely do the same as with a Database previously. If found, then return, else create
            if (col != null)
            {
                //Console.WriteLine("3. Found DocumentCollection.\n3. Deleting DocumentCollection.");
                client.DeleteDocumentCollectionAsync(col.SelfLink).Wait();
            }

            //Console.WriteLine("3. Creating DocumentCollection");
            return client.CreateDocumentCollectionAsync(databaseLink, new DocumentCollection { Id = collectionId }).Result;
        }

        private void CreateDocuments(string collectionLink)
        {
            // DocumentDB provides many different ways of working with documents. 
            // 1. You can create an object that extends the Document base class
            // 2. You can use any POCO whether as it is without extending the Document base class
            // 3. You can use dynamic types
            // 4. You can even work with Streams directly.
            //
            // This sample method demonstrates only the first example
            // For more examples of other ways to work with documents please consult the samples on MSDN. 

            // Work with a well defined type that extends Document
            // In DocumentDB every Document must have an "id" property. If you supply one, it must be unique. 
            // If you do not supply one, DocumentDB will generate a unique value for you and add it to the Document. 
            var task1 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "chan@stratex.com",
                Name = "Test user",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 920", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
             });

            var task2 = client.CreateDocumentAsync(collectionLink, new DbUser
            {
                Email = "andrew@stratex.com",
                Name = "Andrew",
                Devices = new Device[]
                {
                    new Device { Name="Lumia 925", Brand="Nokia"},
                    new Device { Name="Surface 3", Brand="Microsoft"},
                }
            });

            
            // Wait for the above Async operations to finish executing
            Task.WaitAll(task1, task2);
        }

        private void Cleanup(string databaseLink) // receives database.SelfLink, not the database id
        {
            client.DeleteDatabaseAsync(databaseLink).Wait();
        }
        #endregion
    }
}

As you might have noticed, you also need the URL and the Auth Key; you can get them from the Azure portal, which by now will probably have your database up and running:



After that I have added a reference to the DocDbBridge DLL in my app project:


After that I executed my app. By the way, this is the code... impressive:


        private void Button_Click(object sender, RoutedEventArgs e)
        {
            try
            {
                using (Connection conn = new Connection())
                {
                    var user = conn.QueryDocumentsLinq("andrew@stratex.com");

                    if (user != null)
                        Result.Text = user.Name;
                    else
                        Result.Text = "User not found.";
                }
            }
            catch (Exception ex)
            {
                Result.Text = "Exception found: " + ex.Message;
            }
        }

And it failed miserably because, again, it needs the Newtonsoft.Json package... I have installed it on the W8.1 app project.

After that I executed the project again, clicked my marvellous button and got an exception, Microsoft.Azure.Documents.BadRequestException: Syntax error, unexpected end-of-file. It happens here:



client.CreateDatabaseQuery()
                                .Where(d => d.Id == databaseId)
                                .AsEnumerable()
                                .FirstOrDefault();

//And here:
client.CreateDocumentCollectionQuery(databaseLink)
                                .Where(c => c.Id == collectionId)
                                .AsEnumerable()
                                .FirstOrDefault();

It looks like there's something not working quite right in this release... I have changed the line to: 

Database db = client.CreateDatabaseQuery().ToList().Where(d => d.Id == databaseId).FirstOrDefault();

This is a huge issue because it means that you need to retrieve every document first and then filter in memory... completely unusable! :( Could it be because I am using a piece of software that is in preview on a platform that is not supported?... Some would say yes...

The LINQ query had the same issue; I had to change from:

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).Where(u => u.Email == UserEmail).AsEnumerable().FirstOrDefault();

To

return client.CreateDocumentQuery<DbUser>(collection.SelfLink).ToList().Where(u => u.Email == UserEmail).FirstOrDefault();

And I got the same error again when I tried to execute the queries with SQL.

Wrapping up:
Upsides: It kind of works, and that's really cool.
Downsides: What is a database if you cannot query it? I would say... not ideal.

But then again, this is preview software on an unsupported platform... and it works, more or less. Just imagine how cool the software will be in a couple of iterations!
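To make it obvious why the workaround hurts, here it is reduced to plain LINQ-to-Objects (the DbUser class and the data are illustrative stand-ins, not the DocumentDB API): the ToList() materializes every document before the Where ever runs, so the filtering happens in memory on the client.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative stand-in for the document type; not the DocumentDB API.
public class DbUser
{
    public string Email { get; set; }
    public string Name { get; set; }
}

public static class ClientSideFilterDemo
{
    public static DbUser FindUser(IEnumerable<DbUser> collection, string email)
    {
        // ToList() enumerates (i.e. downloads) the whole collection first;
        // the Where then filters the in-memory copy. With a real database
        // this means every document travels over the wire.
        return collection.ToList().Where(u => u.Email == email).FirstOrDefault();
    }
}
```

With a server-side Where, only the matching document would be transferred; here the filter never reaches the server at all.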

No comments:

Post a Comment

23.8.13

Scopes in a CAML Query

I have been working with CAML queries for quite a while now, and the scope is always something very important to bear in mind. How many times have my queries returned nothing when I was sure they should bring something back…

Basically we have two modifiers, Recursive and All, plus setting nothing at all (could we call nothing a modifier?). All will bring back folders and files. Recursive will repeat the query in all the folders under the one we are working with.

If you don’t set the scope to All it will only bring back files. If you don’t set it to Recursive it will only retrieve items from the folder you are in. There are not that many variants, so let’s look at an example of each.

Let’s imagine we have a SharePoint folder like this one and we want to query it:

CamlScopeTreeSample

I am not good at Paint, I know, but what I want to show here is a tree where we have a Root folder (the root of the queries) and two sub-folders with files. For each possible scope I’ll highlight what you can expect to retrieve.

Just before we start, allow me to remind you that the scope is set in the ViewAttributes property of the SPQuery object.

To add a bit more clarity I have also painted the levels:

CamlScopeTreeSampleLevels

The green line marks what’s inside the root folder, the blue line marks the contents of SubFolder1 and the red line SubFolder2.

ViewAttributes left by default:

CamlScopeByDefault
These are just the files directly under the root folder.

ViewAttributes = "Scope='Recursive'"

CamlScopeRecursive

This means all the files in all the folders.

ViewAttributes = "Scope='All'"

CamlScopeAll

This scope will bring back the folders and files directly under the root folder.

ViewAttributes = "Scope='RecursiveAll'"

CamlScopeRecursiveAll

And finally with RecursiveAll you can bring back everything under the root entity.
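As a mental model only (plain C#, not SharePoint code), the four scopes behave like this over a small in-memory tree; the Node type and names are purely illustrative:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A toy model of the four CAML scopes over a folder tree.
public class Node
{
    public string Name;
    public bool IsFolder;
    public List<Node> Children = new List<Node>();
}

public static class ScopeModel
{
    public static IEnumerable<Node> Query(Node root, string scope)
    {
        var direct = root.Children;
        switch (scope)
        {
            case "Default":      return direct.Where(n => !n.IsFolder);       // files in root only
            case "Recursive":    return Descendants(root).Where(n => !n.IsFolder); // all files, everywhere
            case "All":          return direct;                               // files and folders in root only
            case "RecursiveAll": return Descendants(root);                    // everything under root
            default: throw new ArgumentException("Unknown scope: " + scope);
        }
    }

    // Depth-first walk of everything under (but not including) the node.
    private static IEnumerable<Node> Descendants(Node node)
    {
        foreach (var child in node.Children)
        {
            yield return child;
            foreach (var d in Descendants(child)) yield return d;
        }
    }
}
```

If a query "should" return something and doesn't, checking which of these four buckets your items fall into is usually the answer.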

Good luck with your queries.


30.4.13

How Much Are you Bringing Back In Your CAML Queries?

I don't know about you, but I am bringing back too much.

This is one of those things you don't notice until it's too late. I thought adding very strict filters to the CAML queries, to bring back just the items you need, was enough, but there's one more thing you can do.

You can restrict the fields you are retrieving... and you should.

The SQL queries that SharePoint generates when you leave the ViewFields parameter of the CAML query empty are almost twice as complex as the ones it generates when you specify the fields you want to retrieve, or use a view.

Two simple lines like these:

query.ViewFields = "<FieldRef Name=\"Value\" />";
query.ViewFieldsOnly = true;
Can make your query better.

Who could resist doing it right when it's this simple?


Disposing every SPWeb you use... Is it necessary?

I am becoming a bit paranoid about disposing lately. For example:

Until now I would have used things like Web.ParentWeb and then forgotten about them. As I was not "opening" them manually, I was not disposing them either.

Now I usually do things like this:
using (SPWeb web = anotherWeb.ParentWeb)
{
    //Do whatever you need here with the parent web
}
I haven't had the time to check whether this is reducing memory leaks or just making my application more unstable (if I dispose the web while I am using it somewhere else it could cause an exception in the other part of the code).

Do you think that when you dispose a web it automatically disposes every parent web as well?
It could be...



I need to test this, and once I do I'll post about it.


22.4.13

Editing SPListItems from SPWeb.GetSiteData

In my solution I have an unknown number of subsites that contain an unknown number of SPListItems that need to be updated. Looks like the perfect way of testing GetSiteData.

Is the documentation useful? Will you be able to test the queries without problems? Yes, the documentation is OK, but this method brings back a DataTable, and what I needed was to update the SPListItems.

Well, if you look at the DataTable you'll see that it has all the fields you need to retrieve the SPListItem easily.

I have created an extension method, well two really... I love extension methods.
public static SPListItem GetListItemFromSiteData(this DataRow ItemRow, SPSite ParentSite)
{
    // Note: the returned SPListItem keeps a reference to its SPWeb, so the
    // web must not be disposed here; dispose it (via item.Web) when you are
    // done with the item.
    SPWeb Web = ItemRow.GetWebSiteData(ParentSite);
    return Web.Lists[new Guid(ItemRow["ListId"].ToString())].GetItemById(Convert.ToInt32(ItemRow["ID"]));
}

public static SPWeb GetWebSiteData(this DataRow ItemRow, SPSite ParentSite)
{
    return ParentSite.OpenWeb(new Guid(ItemRow["WebId"].ToString()));
}

Using these two you will be able to iterate through the rows of the DataTable, select which elements need to be updated, and update them.

I haven't tested how fast this is compared with bringing the items back with a CAML query. Not having to create the SPListItemCollection could make this method faster and more convenient for some things... I'll give it a go.


17.4.13

The Clean Up Your Own Mess Pattern

Is this a pattern?

In most cases you want to change the value of a variable and leave it that way, but on some occasions you want to change the value, perform some action, and then set it back to whatever it was.

With this piece of code it's really easy and you are certain you won't forget. Ever.

The idea is to create a class that implements the IDisposable interface, saves the values as they were when we created it, and sets them to whatever we need to work with. Later, in Dispose, it changes the variables back to their initial values:
public class ManageAllowUnsafeUpdates : IDisposable
{
    private readonly bool allowUnsafeUpdatesStatus;
    private readonly SPWeb Web;

    public ManageAllowUnsafeUpdates(SPWeb Web)
    {
        this.Web = Web;
        allowUnsafeUpdatesStatus = Web.AllowUnsafeUpdates;
        Web.AllowUnsafeUpdates = true;
    }

    public void Dispose()
    {
        Web.AllowUnsafeUpdates = allowUnsafeUpdatesStatus;
    }
}

Using it later is as simple as:
using (new ManageAllowUnsafeUpdates(Web))
{
    ListItem[this.FieldName] = (String)value;
    ListItem.Update();
}

And you don't have to think any more about whether the value was set to true or false before, because the class will manage that for you.

This is a silly piece of code, but I think it serves well to demonstrate the idea.
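The same idea works for any value, not just AllowUnsafeUpdates. Here is a hedged, framework-free sketch of the pattern (the ScopedValue name is mine, not a framework type): capture the current value, override it for the scope, and restore it on Dispose.

```csharp
using System;

// Generic "clean up your own mess" guard: remembers a value, overrides it,
// and puts it back when the using block ends.
public class ScopedValue<T> : IDisposable
{
    private readonly T original;
    private readonly Action<T> setter;

    public ScopedValue(Func<T> getter, Action<T> setter, T temporaryValue)
    {
        this.setter = setter;
        original = getter();    // remember the current value
        setter(temporaryValue); // override it for the duration of the scope
    }

    public void Dispose()
    {
        setter(original);       // restore it, no matter what the value was
    }
}
```

Usage mirrors the SharePoint class above: `using (new ScopedValue<bool>(() => web.AllowUnsafeUpdates, v => web.AllowUnsafeUpdates = v, true)) { ... }`.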


Tracing Untraceable Exceptions

My application has been working nicely until today. Now, every time I launch it, it crashes with no obvious information but the dreaded CLR20r3.


Description:
  Stopped working

Problem signature:
  Problem Event Name: CLR20r3
  Problem Signature 01: appName.exe
  Problem Signature 02: 1.0.0.0
  Problem Signature 03: 516eb9fb
  Problem Signature 04: mscorlib
  Problem Signature 05: 2.0.0.0
  Problem Signature 06: 4a27471d
  Problem Signature 07: 20c8
  Problem Signature 08: 100
  Problem Signature 09: N3CTRYE2KN3C34SGL4ZQYRBFTE4M13NB
  OS Version: 6.1.7600.2.0.0.274.10
  Locale ID: 2057


I have read on the internet that I'm not the only one.

And the most annoying part is that if I try to debug it, it won't fail...

Finally I found a blog post with answers: Handling “Unhandled Exceptions” in .NET 2.0.

And I came up with this piece of code, that I placed in the Main method of my application:
AppDomain.CurrentDomain.UnhandledException += delegate(object sender, UnhandledExceptionEventArgs e)
{
    EventLog.WriteEntry("appName", string.Format("{0} – IsTerminating = {1}", e.ExceptionObject.ToString(), e.IsTerminating), EventLogEntryType.Error, 001);
};
Then I ran my app again and it crashed, but this time I got a NICE exception in the event viewer.

If you want to know, it was because it couldn't load a DLL it was expecting.

Thanks Mark.


13.4.13

Abstract Classes and Collection Types in Service Reference Settings

I have been adding some objects to one of my web services; they represented a hierarchy of similar items, so I used a nice abstract factory.

The code was working perfectly on the server, and then I added one of those objects as the response of a web service method.

Nice!

Finally I went to a solution that was using that web service, with the naive intention of using it as I always do, and updated the reference, but to my surprise, after the update I got a couple of hundred errors in the code.

What used to be ObservableCollections were now Arrays.

I tried to change the Service Reference Settings back to normal, but they were as usual, with the Collection Type parameter set to ObservableCollection. What was happening?

Back at the code I tried everything I could think of on a Saturday afternoon and nothing changed, until I commented out the abstract class. That made everything go back to normal.

In the end I have changed all the abstract classes to normal classes and the abstract methods to virtual ones. It's not the same and the code is awkward, but now at least I can use my web services as usual.

By the way, I had a lot of private setters I had to get rid of. I was getting:

error CS0200: Property or indexer "propertyName" cannot be assigned to -- it is read only
So I had to change the properties from this:
public string Label { get; private set; }
To this:
public string Label;

So annoying!
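If you would rather not expose public fields, a property with a public setter also keeps serializers happy. A minimal sketch (the LabelDto type is illustrative, not from the original service) showing such a DTO round-tripping through DataContractSerializer, which is what WCF service references typically use under the covers:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// Sketch: a DTO with a public setter can be written and read back by
// DataContractSerializer; a private setter is what triggers CS0200 when
// consuming code tries to assign the property.
[DataContract]
public class LabelDto
{
    [DataMember]
    public string Label { get; set; } // public setter: serializer-friendly
}

public static class RoundTrip
{
    public static LabelDto Clone(LabelDto source)
    {
        var serializer = new DataContractSerializer(typeof(LabelDto));
        using (var stream = new MemoryStream())
        {
            serializer.WriteObject(stream, source); // serialize to XML
            stream.Position = 0;
            return (LabelDto)serializer.ReadObject(stream); // and back
        }
    }
}
```

Whether the generated proxy preserves the setter's accessibility is a different matter, but on the DTO side this shape avoids both the public field and the read-only property.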

By the way, thanks Fiddler2.

I don't usually post about unsolved issues, but then again I don't usually write one post a day either. It's not my fault, it's The Ultimate Blog Challenge's...




4.4.13

Dynamic Lock Object Provider

Sometimes you need to create a lock object dynamically, like in the post I wrote yesterday.

When you are retrieving a property from the property bag you don’t want to block all the property bags for all the webs; you just want to make sure the property you are looking for has not changed since you started the lock, and that needs a dynamic way of creating locks.

I started this class with a dictionary, but after using it for a while I realized that dictionaries are not thread safe when read while another thread writes… so I changed it to a Hashtable, which is thread safe for one writer and many readers.
public class Locks
{
    /// <summary>
    /// This hashtable will provide a lock for every different key in the cache. Hashtables are threadsafe for one writer many readers scenario
    /// </summary>
    private static Hashtable LocksCollection = new Hashtable();

    public static object GetLock(string Key, params object[] args)
    {
        if (args != null)
            Key = string.Format(Key, args);
           
        if (!LocksCollection.ContainsKey(Key))
        {
            lock (LocksCollection)
                if (!LocksCollection.ContainsKey(Key))
                    LocksCollection.Add(Key, new object());
        }

        return LocksCollection[Key];
    }
}

With this new addition, in the method for setting web properties we just lock that web's property bag, leaving the other webs on the server unlocked. It looks like this:
/// <summary>
/// Sets a web property to a given value. This method is thread safe.
/// </summary>
/// <param name="Key">The Key for the web property (case insensitive)</param>
/// <param name="Value">Value to set the property to</param>
public static void SetWebProperty(this SPWeb Web, string Key, string Value)
{
    Key = Key.ToLower();
    lock (Locks.GetLock("WebPropertyWriteLock - {0}", Web.ID))
    {
        if (GetWebPropertyThreadSafe(Web, Key) != Value)
        {
            Web.Properties[Key] = Value;

            Web.Properties.Update();
        }
    }
}
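The Hashtable plus double-checked lock works, but on .NET 4 and later ConcurrentDictionary.GetOrAdd gives you the same per-key lock object in one atomic call. A sketch of the equivalent class (same illustrative API, not a tested drop-in replacement for the SharePoint code above):

```csharp
using System;
using System.Collections.Concurrent;

public static class KeyedLocks
{
    private static readonly ConcurrentDictionary<string, object> locks =
        new ConcurrentDictionary<string, object>();

    public static object GetLock(string key, params object[] args)
    {
        if (args != null && args.Length > 0)
            key = string.Format(key, args);

        // GetOrAdd is atomic per key: every caller asking for the same key
        // gets the same lock object back, so you can lock per property bag.
        return locks.GetOrAdd(key, _ => new object());
    }
}
```

Usage is identical: `lock (KeyedLocks.GetLock("WebPropertyWriteLock - {0}", Web.ID)) { ... }`.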


3.4.13

Sharing Data Between Web Front Ends With Thread Safe Web Properties

Remember back in the old days when you only had one frontend? Remember when you thought that was complex?

Times have changed and now almost every production environment has several frontends, and that’s good: performance is boosted, and with SharePoint it’s simple to add one more if you feel you are falling short. But this has also taken some tasks to a new level of complexity.

There is software out there, like AppFabric, that will allow you to persist data and share it between all your servers, and it is good… but would you install and configure it everywhere just to share a 10-character string?

Well, you don’t have to. You can use SharePoint web properties instead. They are fast to write, to read and to code against, and you don’t need to configure anything.

SharePoint web properties are a bit tricky to use though, just a bit, particularly when you are using them in a multithreaded process, but I hope these wrappers save you some time. (I have used them in a couple of scenarios and they work as expected, but I can’t assure you they are completely safe.)
static object PropertyWriteLock = new object();

/// <summary>
/// Sets a web property to a given value. This method is thread safe.
/// </summary>
/// <param name="Key">The Key for the web property (case insensitive)</param>
/// <param name="Value">Value to set the property to</param>
public static void SetWebProperty(this SPWeb Web, string Key, string Value)
{
    Key = Key.ToLower();
    lock (PropertyWriteLock) //It's better to have a lock just for the key we are working with. I'll post the trick maybe tomorrow.
    {
        if (GetWebPropertyThreadSafe(Web, Key) != Value)
        {
            Web.Properties[Key] = Value;

            Web.Properties.Update();
        }
    }
}

/// <summary>
/// Returns the web property with the given key. This method is thread safe.
/// </summary>
/// <param name="Key">The Key for the web property (case insensitive)</param>
/// <returns>Returns null if not found</returns>
public static string GetWebPropertyThreadSafe(this SPWeb Web, string Key)
{
    Key = Key.ToLower();
    using (SPSite site = new SPSite(Web.Site.ID))
    {
        using (SPWeb newWeb = site.OpenWeb(Web.ID))
        {
            return newWeb.GetWebProperty(Key);
        }
    }
}

/// <summary>
/// Returns the web property with the given key.
/// </summary>
/// <param name="Key">The Key for the web property (case insensitive)</param>
/// <returns>Returns null if not found</returns>
public static string GetWebProperty(this SPWeb Web, string Key)
{
    Key = Key.ToLower();

    return Web.Properties[Key];
}

/// <summary>
/// Removes the web property from the web
/// </summary>
/// <param name="Key">The Key for the web property (case insensitive)</param>
/// <remarks>The web property will remain there but set to null.</remarks>
public static void RemoveWebProperty(this SPWeb Web, string Key)
{
    if (Web.Properties.ContainsKey(Key))
        Web.Properties.Remove(Key);

    Web.Properties.Update();

    if (Web.AllProperties.ContainsKey(Key))
        Web.AllProperties.Remove(Key);

    Web.Update();
}

They have simplified some parts of my application a lot and, I’ll say it again, they are fast. Give them a go and let me know your thoughts.


2.4.13

Iterating Lists in Small Chunks And Performance

A couple of months ago I wrote a post about this, and in the conclusion I stated that it would probably be faster to act on the whole list in chunks than with one big query… It’s not.

I have run the test with a list of 100K items so you don’t have to and, depending on the size of the chunks, it can take slightly longer or a lot longer.

If you are using a very small RowLimit (say 5), the number of times you have to run the query to retrieve the whole list will make the process take twice as long or more. If you use a bigger number (like 1000), the performance will be similar to retrieving the whole list and processing it in a single query.

In summary, if you are looking for a small number of values and they are likely to be found in the first iterations of the paginated query, the performance boost will be obvious; in some cases I have measured an average improvement of up to 95%. But bear in mind that if you are using very small chunks and you are likely to go through the whole list to get the values back, your query will be slower, up to 400% slower or even more.
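Stripped of the SharePoint specifics, the chunked approach is just a paged loop with an early exit, which is where the boost comes from when matches appear early. A sketch with illustrative names (Skip/Take stands in for SPQuery.RowLimit plus ListItemCollectionPosition):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Chunker
{
    // Scans the source one page of rowLimit items at a time, collecting
    // items matching the predicate, and stops as soon as enough matches
    // are found. Each Skip/Take page plays the role of one paged query.
    public static List<T> FindInChunks<T>(
        IReadOnlyList<T> source, int rowLimit, Func<T, bool> predicate, int wanted)
    {
        var found = new List<T>();
        for (int position = 0; position < source.Count; position += rowLimit)
        {
            var page = source.Skip(position).Take(rowLimit); // one "query"
            found.AddRange(page.Where(predicate));
            if (found.Count >= wanted) break; // early exit: the performance boost
        }
        return found;
    }
}
```

When the matches live in the last pages, every iteration runs and the per-page overhead dominates, which is exactly the slowdown described above.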


10.12.12

ObservableDictionary for Silverlight

I have reused (well, copied) the code from Shimmy's Blog, but had to tweak it a bit to make it fit Silverlight. At the moment I have used it in just one of my projects, but it works very well.

using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.ComponentModel;
using System.Linq;

namespace SLUtils.Classes
{
        public class ObservableDictionary<TKey, TValue> : IDictionary<TKey, TValue>, INotifyCollectionChanged, INotifyPropertyChanged
        {
            private const string CountString = "Count";
            private const string IndexerName = "Item[]";
            private const string KeysName = "Keys";
            private const string ValuesName = "Values";

            private IDictionary<TKey, TValue> _Dictionary;
            protected IDictionary<TKey, TValue> Dictionary
            {
                get { return _Dictionary; }
            }

            #region Constructors
            public ObservableDictionary()
            {
                _Dictionary = new Dictionary<TKey, TValue>();
            }
            public ObservableDictionary(IDictionary<TKey, TValue> dictionary)
            {
                _Dictionary = new Dictionary<TKey, TValue>(dictionary);
            }
            public ObservableDictionary(IEqualityComparer<TKey> comparer)
            {
                _Dictionary = new Dictionary<TKey, TValue>(comparer);
            }
            public ObservableDictionary(int capacity)
            {
                _Dictionary = new Dictionary<TKey, TValue>(capacity);
            }
            public ObservableDictionary(IDictionary<TKey, TValue> dictionary, IEqualityComparer<TKey> comparer)
            {
                _Dictionary = new Dictionary<TKey, TValue>(dictionary, comparer);
            }
            public ObservableDictionary(int capacity, IEqualityComparer<TKey> comparer)
            {
                _Dictionary = new Dictionary<TKey, TValue>(capacity, comparer);
            }
            #endregion

            #region IDictionary<TKey,TValue> Members

            public void Add(TKey key, TValue value)
            {
                Insert(key, value, true);
            }

            public bool ContainsKey(TKey key)
            {
                return Dictionary.ContainsKey(key);
            }

            public ICollection<TKey> Keys
            {
                get { return Dictionary.Keys; }
            }

            public bool Remove(TKey key)
            {
                if (key == null) throw new ArgumentNullException("key");

                TValue value;
                Dictionary.TryGetValue(key, out value);

                int index = GetIndex(Dictionary, key);
                var removed = Dictionary.Remove(key);

                if (removed)
                    OnCollectionChanged(NotifyCollectionChangedAction.Remove, new KeyValuePair<TKey, TValue>(key, value), index);
                    //OnCollectionChanged();
                return removed;
            }

            public bool TryGetValue(TKey key, out TValue value)
            {
                return Dictionary.TryGetValue(key, out value);
            }

            public ICollection<TValue> Values
            {
                get { return Dictionary.Values; }
            }

            public TValue this[TKey key]
            {
                get
                {
                    return Dictionary[key];
                }
                set
                {
                    Insert(key, value, false);
                }
            }

            #endregion

            #region ICollection<KeyValuePair<TKey,TValue>> Members

            public void Add(KeyValuePair<TKey, TValue> item)
            {
                Insert(item.Key, item.Value, true);
            }

            public void Clear()
            {
                if (Dictionary.Count > 0)
                {
                    Dictionary.Clear();
                    OnCollectionChanged();
                }
            }

            public bool Contains(KeyValuePair<TKey, TValue> item)
            {
                return Dictionary.Contains(item);
            }

            public void CopyTo(KeyValuePair<TKey, TValue>[] array, int arrayIndex)
            {
                Dictionary.CopyTo(array, arrayIndex);
            }

            public int Count
            {
                get { return Dictionary.Count; }
            }

            public bool IsReadOnly
            {
                get { return Dictionary.IsReadOnly; }
            }

            public bool Remove(KeyValuePair<TKey, TValue> item)
            {
                return Remove(item.Key);
            }

            #endregion

            #region IEnumerable<KeyValuePair<TKey,TValue>> Members

            public IEnumerator<KeyValuePair<TKey, TValue>> GetEnumerator()
            {
                return Dictionary.GetEnumerator();
            }

            #endregion

            #region IEnumerable Members

            IEnumerator IEnumerable.GetEnumerator()
            {
                return ((IEnumerable)Dictionary).GetEnumerator();
            }

            #endregion

            #region INotifyCollectionChanged Members

            public event NotifyCollectionChangedEventHandler CollectionChanged;

            #endregion

            #region INotifyPropertyChanged Members

            public event PropertyChangedEventHandler PropertyChanged;

            #endregion

            public void AddRange(IDictionary<TKey, TValue> items)
            {
                if (items == null) throw new ArgumentNullException("items");

                if (items.Count > 0)
                {
                    if (Dictionary.Count > 0)
                    {
                        if (items.Keys.Any((k) => Dictionary.ContainsKey(k)))
                            throw new ArgumentException("An item with the same key has already been added.");
                        else
                            foreach (var item in items) Dictionary.Add(item);
                    }
                    else
                        _Dictionary = new Dictionary<TKey, TValue>(items);

                    OnCollectionChanged(NotifyCollectionChangedAction.Add, items.ToArray(), GetIndex(Dictionary, items.Keys.First())); //We notify the index of the first added key
                }
            }

            private void Insert(TKey key, TValue value, bool add)
            {
                if (key == null) throw new ArgumentNullException("key");

                TValue item;
                if (Dictionary.TryGetValue(key, out item))
                {
                    if (add) throw new ArgumentException("An item with the same key has already been added.");
                    if (Equals(item, value)) return;
                    Dictionary[key] = value;

                    OnCollectionChanged(NotifyCollectionChangedAction.Replace, new KeyValuePair<TKey, TValue>(key, value), new KeyValuePair<TKey, TValue>(key, item), GetIndex(Dictionary, key));
                }
                else
                {
                    Dictionary[key] = value;

                    OnCollectionChanged(NotifyCollectionChangedAction.Add, new KeyValuePair<TKey, TValue>(key, value), GetIndex(Dictionary, key));
                }
            }

            private void OnPropertyChanged()
            {
                OnPropertyChanged(CountString);
                OnPropertyChanged(IndexerName);
                OnPropertyChanged(KeysName);
                OnPropertyChanged(ValuesName);
            }

            private int GetIndex(IDictionary<TKey, TValue> Dictionary, TKey key)
            {
                int i = 0;
                foreach (var dictKey in Dictionary.Keys)
                {
                    if (dictKey.Equals(key)) return i;
                    i++; // advance the index for the next key
                }

                return -1;
            }

            protected virtual void OnPropertyChanged(string propertyName)
            {
                if (PropertyChanged != null) PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
            }

            private void OnCollectionChanged()
            {
                OnPropertyChanged();
                if (CollectionChanged != null) CollectionChanged(this, new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
            }

            private void OnCollectionChanged(NotifyCollectionChangedAction action, KeyValuePair<TKey, TValue> changedItem, int index)
            {
                OnPropertyChanged();
                if (CollectionChanged != null) CollectionChanged(this, new NotifyCollectionChangedEventArgs(action, changedItem, index));
            }

            private void OnCollectionChanged(NotifyCollectionChangedAction action, KeyValuePair<TKey, TValue> newItem, KeyValuePair<TKey, TValue> oldItem, int index)
            {
                OnPropertyChanged();
                if (CollectionChanged != null) CollectionChanged(this, new NotifyCollectionChangedEventArgs(action, newItem, oldItem, index));
            }

            private void OnCollectionChanged(NotifyCollectionChangedAction action, IList newItems, int index)
            {
                OnPropertyChanged();
                if (CollectionChanged != null) CollectionChanged(this, new NotifyCollectionChangedEventArgs(action, newItems, index));
            }

            public override string ToString()
            {
                return string.Format("ObservableDictionary<{0}> Count={1}", GetKeyAndValue(Dictionary.GetType().GetGenericArguments()), Count);
            }

            private object GetKeyAndValue(Type[] Types)
            {
                string result = string.Empty;

                foreach (Type type in Types)
                {
                    result += type.ToString() + ",";
                }

                if (!string.IsNullOrEmpty(result) && result.Length > 1)
                    result = result.Substring(0, result.Length - 1);

                return result;
            }
        }
}
Give it a go and let me know how it works.


15.11.12

Click works but Tapped Crashes on Windows 8

Our app has been rejected from the store because it crashes when the user taps in a GridView. As we don’t have a proper Surface we were using the simulator to test our software, but with the mouse tool, not with the finger, and it was working perfectly…

It crashes every time you tap an item in a GridView, and it’s funny, funny peculiar, not funny ha ha, because it crashes right on the tap; the code doesn't even get to the callback, it just crashes. The crash info is not useful, as expected.

Looks like there’s a bug in the GridView: after it’s assigned a populated DataContext it needs to be refreshed before it accepts taps, even though it works fine with clicks.

There’s no way (or I haven’t found it) of refreshing the UI in WinRT, which I reckon would solve this issue, so I have taken two different approaches to work around this bug.

  • Fast-loading dashboards: I am preloading the contents of the fast-loading dashboards once the app starts. This way I get all of the GridViews populated, in the background, on a page that is not being displayed to the user. Once the page with the GridView is selected by the user, it is rendered along with the GridView object, refreshing it.
  • Slow-loading dashboards: I am preloading the contents of these dashboards too, but if the user clicks them right after the app loads they will still be loading, causing the GridView to render just once after the DataContext is populated. In these cases I have changed the behaviour of the dashboard: I assign the DataContext up front and update it several times, once for every new chunk of data received. This way the GridView gets refreshed once per chunk of new information and the issue with tapping is solved.

The guys at the Windows Store App Lab said they would find a proper solution to this issue, but I think this is the kind of bug that will be fixed through Windows Update. Anyway, I’ll keep you posted if they come back with something new.

***UPDATE***
A Microsoft expert pointed out that if you create and populate the GridView in code and add it to the page it will work as expected. I've tested this and it's true.

The templates in my GridViews are complex, so I'll make them user controls so I can create and populate them in code without losing the MVVM approach. I still haven't tested this.


3.10.12

A Thread.Sleep that does not freeze the thread

I have been trying to add unit tests to my SharePoint code and it’s not an easy task. One of the main things I wanted to test was the ItemEventReceivers on a list. This works if you use the synchronous events such as ItemAdding or ItemUpdating, but when it comes to testing whether the asynchronous events have taken place as expected, you need to wait a bit.

Putting the test thread to sleep prevents the asynchronous events from happening (they run on the same thread as the SharePoint code) and, even though your code works when you execute it manually, all of the tests fail. So I have created a new class, based on timers, that lets the asynchronous events trigger and execute. I call it Waiter.
using System;
using System.Threading;
using System.Timers;

public class Waiter : IDisposable
{
    public enum WaiterState
    {
        Waiting,
        TimedOut,
        Success,
        Error
    };

    System.Timers.Timer WaitTimer;
    ManualResetEvent manualResetEvent;
    int WaitCounter;

    private Waiter(int interval)
    {
        WaitCounter = 0;

        // Start unsignalled so WaitOne really blocks until the timer fires.
        manualResetEvent = new ManualResetEvent(false);
        WaitTimer = new System.Timers.Timer() { AutoReset = false, Interval = interval };
        WaitTimer.Elapsed += new ElapsedEventHandler(WaitTimer_Elapsed);
    }

    void WaitTimer_Elapsed(object sender, ElapsedEventArgs e)
    {
        WaitCounter++;
        manualResetEvent.Set();
    }

    /// <summary>
    /// Waits for the interval in milliseconds the given number of times plus one (once by default).
    /// </summary>
    public static WaiterState Wait(int interval, int times)
    {
        try
        {
            using (Waiter WaiterClass = new Waiter(interval))
            {
                while (WaiterClass.WaitCounter <= times)
                {
                    WaiterClass.manualResetEvent.Reset();
                    WaiterClass.WaitTimer.Start();
                    WaiterClass.manualResetEvent.WaitOne();
                }
            }
        }
        catch
        {
            return WaiterState.Error;
        }

        return WaiterState.Success;
    }

    /// <summary>
    /// Waits for the interval in milliseconds once.
    /// </summary>
    public static WaiterState Wait(int interval)
    {
        return Wait(interval, 0);
    }

    public void Dispose()
    {
        WaitTimer.Dispose();
        manualResetEvent.Close();
    }
}
Give it a go.


Restarting SharePoint 2010 Timer Service Programmatically

I needed to restart the Timer Service from an application and it took me some time to find the answer. In the end, it’s as easy as you would hope.

Anyway here's the code:
/// <summary>
/// Stops the SharePoint timer.
/// </summary>
public static void TimerStop()
{
    ServiceController timerService = new ServiceController(Constants.SPTimerName);

    if (timerService.Status == ServiceControllerStatus.Running)
    {
        timerService.Stop();
        timerService.WaitForStatus(ServiceControllerStatus.Stopped, Constants.WaitingTimeout);
    }
}

/// <summary>
/// Starts the SharePoint timer.
/// </summary>
public static void TimerStart()
{
    ServiceController timerService = new ServiceController(Constants.SPTimerName);

    if (timerService.Status == ServiceControllerStatus.Stopped)
    {
        timerService.Start();
        timerService.WaitForStatus(ServiceControllerStatus.Running, Constants.WaitingTimeout);
    }
}
And by the way the constants are:
public static TimeSpan WaitingTimeout = new TimeSpan(0, 1, 30);
public static string SPTimerName = "SharePoint 2010 Timer";


17.9.12

Cannot import the following key file error in VS2012

I don’t know if this also works in VS2010 or 2008, but in VS2012 it is really easy to add the key to the pfx file.

If you are getting the infamous error:
Cannot import the following key file: <FileName>.pfx. The key file may be password protected. To correct this, try to import the certificate again or manually install the certificate to the Strong Name CSP with the following key container name: VS_KEY_<A Key>

You can solve it easily by:

Click Properties on the project that contains the file that is causing the issue.

[screenshot]

Then go to "Signing" and select the file again. You will be prompted for the password.

[screenshot]

Enter the password and you are done.
