Channel: General .NET – Brian Pedersen's Sitecore and .NET Blog

Sitecore ContentSearch Get Latest Version


The Sitecore ContentSearch API allows you to index content in either Lucene.NET or SOLR, dramatically speeding up retrieval of content, especially when querying items that are scattered across your content tree.

Content retrieved from the ContentSearch API does not come back as Sitecore content items (of type Sitecore.Data.Items.Item); it comes back as objects that you have defined yourself. This is why, when querying the master database index (sitecore_master_index), you will receive one result per version (and one result per language) rather than one Item object containing all versions and languages.

To overcome this issue, Sitecore has added a few nifty fields to the index, for example the _latestversion field:

Latest Version Field found in my SOLR index

So when querying the index, you can filter on the _latestversion field:

query = query.Where(item => item["_latestversion"].Equals("1")); 

By the way, _latestversion is defined as a constant in Sitecore:

Sitecore.ContentSearch.BuiltinFields.LatestVersion;
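So the query from before can also be written without the magic string:

```csharp
// Same query, using the Sitecore constant instead of the "_latestversion" string
query = query.Where(item => item[Sitecore.ContentSearch.BuiltinFields.LatestVersion].Equals("1"));
```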




Sitecore user:created event not fired on Membership.CreateUser


Since around version 7.5, Sitecore has fired events each time you manipulate users in the .NET Membership database:

<event name="user:created"></event>
<event name="user:deleted"></event>
<event name="user:updated"></event>
<event name="roles:usersAdded"></event>
<event name="roles:usersRemoved"></event>

But I noticed that the user:created event was not fired. This is because I call the .NET Membership provider directly:

string userNameWithDomain = "extranet\\myuser";
string password = "somepassword";
string email = "myuser@somewhere.com";
Membership.CreateUser(userNameWithDomain, password, email);

This call to Membership is not handled by Sitecore, thus no event is executed. To fix this I have found 2 solutions; one is not good, and the other one is not good either.

SOLUTION 1: CALL THE SITECORE MEMBERSHIP PROVIDER DIRECTLY

This solution ignores the web.config settings and assumes that you have not switched or overridden the Membership Provider yourself. It will, however, fire the event:

Sitecore.Security.SitecoreMembershipProvider provider = new Sitecore.Security.SitecoreMembershipProvider();
MembershipCreateStatus status;
provider.CreateUser(userNameWithDomain, password, email, "", "", true, null, out status);
if (status != MembershipCreateStatus.Success)
  throw new MembershipCreateUserException(status);

SOLUTION 2: RAISE THE EVENT YOURSELF

This solution requires you to raise the event yourself. You need to encapsulate the call that creates a user in your own class, and instruct everyone never to call Membership.CreateUser() directly:

MembershipUser user = Membership.CreateUser(userNameWithDomain, password, email);
Event.RaiseEvent("user:created", user);

I can see from other blog posts that the user events are not the most widely used events in the Sitecore toolbox. If you have found another solution to this problem please let me know.
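If you go with solution 2, the encapsulation could look something like this (a sketch; the UserService class name and its namespace are my own invention):

```csharp
using System.Web.Security;
using Sitecore.Events;

namespace MyNamespace
{
  // All user creation must go through this class, so that the
  // user:created event is always raised
  public static class UserService
  {
    public static MembershipUser CreateUser(string userNameWithDomain, string password, string email)
    {
      MembershipUser user = Membership.CreateUser(userNameWithDomain, password, email);
      Event.RaiseEvent("user:created", user);
      return user;
    }
  }
}
```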

Sitecore ContentSearch – Get items from SOLR or Lucene – A base class implementation


Reading items from Sitecore is pretty straightforward:

Sitecore.Data.Items.Item item =
   Sitecore.Context.Database.GetItem("/sitecore/context/.../...");

And it is fast, unless you need to retrieve items from many paths, or need to retrieve every item that derives from a certain base template. In these situations you resort to using the built-in ContentSearch, which is a Lucene or SOLR index.

When working with objects from the ContentSearch API, you will have to create your own model classes that map the indexed fields to class properties. This is done by applying an IndexField attribute to the properties of the class that will represent the indexed data:

[IndexField("customerpage")]
public ID CustomerPageId { get; internal set; }

[IndexField("customername")]
public string CustomerName { get; internal set; }

The default indexes, called sitecore_core_index, sitecore_master_index and sitecore_web_index, are born with a long list of default fields that are useful for every class. Because of this, it makes sense to let each of your model classes inherit from a base class that maps these fields for you.

So let’s code.

STEP 1: CREATE A BASE CLASS

This base class maps the most common fields. There are many more for you to explore, but this particular class has been the base class of a huge project that I have been working on for the past 4 years:

using System;
using System.Collections.Generic;
using System.ComponentModel;
using Sitecore.Configuration;
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.Converters;
using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.Diagnostics;

namespace MySearch
{
  [Serializable]
  public abstract class SearchResultItem
  {
    [NonSerialized]
    private Item _item;

    // Get the actual Sitecore item. Beware that using this property
    // will substantially slow your query, as it looks up the item
    // in Sitecore. Use with caution, and try to avoid using it in
    // LINQ or enumerations
    public virtual Item Item
    {
      get { return _item ?? (_item = GetItem()); } set { _item = value; }
    }

    // Returns the Item ID (in SOLR this is stored as a short GUID in the _group field)
    [IndexField(Sitecore.ContentSearch.BuiltinFields.Group)]
    [TypeConverter(typeof(IndexFieldIDValueConverter))]
    public virtual ID ItemId
    {
      get; set;
    }

    // This is a combined key describing the Sitecore item in details
    // For example: sitecore://web/{7102ee6b-6361-41ad-a47f-832002082a1a}?lang=da&ver=1&ndx=sitecore_web_index
    // With the ItemUri class you can extract the individual values like database, id, language, version
    [IndexField(Sitecore.ContentSearch.BuiltinFields.UniqueId)]
    [TypeConverter(typeof(IndexFieldItemUriValueConverter))]
    public virtual ItemUri ItemUri
    {
      get; set;
    }

    // Return the item language
    [IndexField(Sitecore.ContentSearch.BuiltinFields.Language)]
    public virtual string Language
    {
      get; set;
    }

    // Returns true if the item is the latest version. When reading from the
    // web database index, this will always be true.
    [IndexField(Sitecore.ContentSearch.BuiltinFields.LatestVersion)]
    public bool IsLatestVersion
    {
      get; set;
    }

    // Returns the ID's of every parent sorted by top parent first
    [IndexField(Sitecore.ContentSearch.BuiltinFields.Path)]
    [TypeConverter(typeof(IndexFieldEnumerableConverter))]
    public virtual IEnumerable<ID> ItemAncestorsAndSelf
    {
      get; set;
    }

    // Returns the updated datetime
    [IndexField(Sitecore.ContentSearch.BuiltinFields.SmallUpdatedDate)]
    public DateTime Updated
    {
      get; set;
    }

    // Returns every template that this item implements and inherits
    [IndexField(Sitecore.ContentSearch.BuiltinFields.AllTemplates)]
    [TypeConverter(typeof(IndexFieldEnumerableConverter))]
    public virtual IEnumerable<ID> ItemBaseTemplates
    {
      get; set;
    }

    private Item GetItem()
    {
      Assert.IsNotNull(ItemUri, "ItemUri is null.");
      return Factory.GetDatabase(ItemUri.DatabaseName).GetItem(ItemUri.ItemID, ItemUri.Language, ItemUri.Version);
    }
  }
}

STEP 2: CREATE A MODEL CLASS FOR A SPECIFIC TEMPLATE

This example inherits from the SearchResultItem base class, and encapsulates a Customer template containing 2 fields, CustomerPage and CustomerName.

using System;
using System.Runtime.Serialization;
using Sitecore.ContentSearch;
using Sitecore.Data;

namespace MySearch
{
  [DataContract]
  [Serializable]
  public class CustomerModel : SearchResultItem
  {
    [DataMember]
    [IndexField("customername")]
    public string CustomerName { get; internal set; }

    [IndexField("customerpage")]
    public ID CustomerPageId { get; internal set; }
  }
}

STEP 3: USING THE BASE CLASS TO SEARCH USING PREDICATES

Predicate is a Latin word for “making search so much easier”. Predicates define reusable static functions. When run, predicates become part of the index query itself, further improving performance. So let’s start by making 3 predicates:

using System;
using System.Linq;
using System.Linq.Expressions;
using Sitecore;
using Sitecore.Data;

namespace MySearch
{
  public static class Predicates
  {
    // Ensure that we only return the latest version
    public static Expression<Func<T, bool>> IsLatestVersion<T>() where T : SearchResultItem
    {
      return searchResultItem => searchResultItem.IsLatestVersion;
    }

    // Ensure that the item returned is based on, or inherits from the specified template
    public static Expression<Func<T, bool>> IsDerived<T>(ID templateID) where T : SearchResultItem
    {
      return searchResultItem => searchResultItem.ItemBaseTemplates.Contains(templateID);
    }

    // Ensure that the item returned is a content item by checking that the
    // content root is part of the item path
    public static Expression<Func<T, bool>> IsContentItem<T>() where T : SearchResultItem
    {
      return searchResultItem => searchResultItem.ItemAncestorsAndSelf.Contains(ItemIDs.ContentRoot);
    }
  }
}

With these predicates in place, I can create a repository for my Customer items:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using Sitecore;
using Sitecore.ContentSearch;
using Sitecore.ContentSearch.Linq.Utilities;
using Sitecore.ContentSearch.Security;
using Sitecore.Data;
using Sitecore.Diagnostics;

namespace MySearch
{
  public class CustomerModelRepository
  {
    private readonly Database _database;

    public CustomerModelRepository() : this(Context.Database)
    {
    }

    public CustomerModelRepository(Database database)
    {
      _database = database;
    }

    public IEnumerable<CustomerModel> GetAll()
    {
      return Get(PredicateBuilder.True<CustomerModel>());
    }

    private IEnumerable<CustomerModel> Get(Expression<Func<CustomerModel, bool>> predicate)
    {
      using (IProviderSearchContext context = GetIndex(_database).CreateSearchContext(SearchSecurityOptions.DisableSecurityCheck))
      {
        return context.GetQueryable<CustomerModel>()
          .Where(Predicates.IsDerived<CustomerModel>(new ID("{1EB6DC02-4EBD-427A-8E36-7D2327219B6C}")))
          .Where(Predicates.IsLatestVersion<CustomerModel>())
          .Where(Predicates.IsContentItem<CustomerModel>())
          .Where(predicate).ToList();
      }
    }

    private static ISearchIndex GetIndex(Database database)
    {
      Assert.ArgumentNotNull(database, "database");
      switch (database.Name.ToLowerInvariant())
      {
        case "core":
          return ContentSearchManager.GetIndex("sitecore_core_index");
        case "master":
          return ContentSearchManager.GetIndex("sitecore_master_index");
        case "web":
          return ContentSearchManager.GetIndex("sitecore_web_index");
        default:
          throw new ArgumentException(string.Format("Database '{0}' doesn't have a default index.", database.Name));
      }
    }
  }
}

The private Get() method returns every index item matching these criteria:

  • Must implement or derive from the template with the specified GUID (the GUID of the Customer template) = Predicates.IsDerived
  • And must be the latest version = Predicates.IsLatestVersion
  • And must be a content item = Predicates.IsContentItem

The repository is used like this:

CustomerModelRepository rep = new CustomerModelRepository(Sitecore.Context.Database);
IEnumerable<CustomerModel> allCustomers = rep.GetAll();
foreach (CustomerModel customer in allCustomers)
{
  // do something with the customer
  Console.WriteLine(customer.CustomerName);
}
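The private Get() method also makes it easy to extend the repository with more specific lookups. A hypothetical GetByName method, assuming the customername index field from step 2, could be added to CustomerModelRepository like this:

```csharp
public IEnumerable<CustomerModel> GetByName(string customerName)
{
  // Reuse the private Get() method with an extra predicate
  return Get(item => item.CustomerName == customerName);
}
```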

I hope this introduction will help you create your own base class implementation and start making fast content searches.


Requesting Azure API Management URL’s


Azure API Management is a scalable and secure API gateway/proxy/cache where you can expose your APIs externally and still have secure access.

In Azure API Management you create a “Product”, which is a collection of APIs that are protected using the same product key.

2 Azure API Management products, protected with a key

The 2 products above contain a collection of APIs, and each product has its own key.

As a developer you can find the API Keys using the Azure API Management Service Developer Portal:

APIM Developer Portal

When clicking around you will end up finding the “Try it” button where you are allowed to test your API endpoints:

Try it button

And here you can get the subscription key by clicking the icon shaped as an eye:

Find the key here

When calling any API, you simply need to add the subscription key to the request in the following header field:

  • Ocp-Apim-Subscription-Key

This is an example of how to GET or POST to an API that is secured by Azure API Management. There are many ways to do it, and this is not the most elegant, but this code will work in production with most versions of .NET:

using System;
using System.IO;
using System.Net;
using System.Text;

namespace MyNamespace
{
  public class AzureApimService
  {
    private readonly string _domain;
    private readonly string _ocp_Apim_Subscription_Key;

    public AzureApimService(string domain, string subscriptionKey)
    {
      _domain = domain;
      _ocp_Apim_Subscription_Key = subscriptionKey;
    }

    public byte[] Get(string relativePath, out string contentType)
    {
      Uri fullyQualifiedUrl = GetFullyQualifiedURL(_domain, relativePath);
      try
      {
        byte[] bytes;
        HttpWebRequest webRequest = (HttpWebRequest) WebRequest.Create(fullyQualifiedUrl);
        webRequest.Headers.Add("Ocp-Apim-Trace", "true");
        webRequest.Headers.Add("Ocp-Apim-Subscription-Key", _ocp_Apim_Subscription_Key);
        webRequest.UserAgent = "YourUserAgent";
        webRequest.KeepAlive = false;
        webRequest.ProtocolVersion = HttpVersion.Version10;
        webRequest.ServicePoint.ConnectionLimit = 24;
        webRequest.Method = WebRequestMethods.Http.Get;
        using (WebResponse webResponse = webRequest.GetResponse())
        {
          contentType = webResponse.ContentType;
          using (Stream stream = webResponse.GetResponseStream())
          {
            using (MemoryStream memoryStream = new MemoryStream())
            {
              byte[] buffer = new byte[0x1000];
              int bytesRead;
              while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
              {
                memoryStream.Write(buffer, 0, bytesRead);
              }
              bytes = memoryStream.ToArray();
            }
          }
        }
        // For test/debug purposes (to see what is actually returned by the service)
        Console.WriteLine("Response data (relativePath: \"{0}\"):\n{1}\n\n", relativePath, Encoding.Default.GetString(bytes));
        return bytes;
      }
      catch (Exception ex)
      {
        throw new Exception("Failed to retrieve data from '" + fullyQualifiedUrl + "': " + ex.Message, ex);
      }
    }

    public byte[] Post(string relativePath, byte[] postData, out string contentType)
    {
      Uri fullyQualifiedUrl = GetFullyQualifiedURL(_domain, relativePath);
      try
      {
        byte[] bytes;
        HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(fullyQualifiedUrl);
        webRequest.Headers.Add("Ocp-Apim-Trace", "true");
        webRequest.Headers.Add("Ocp-Apim-Subscription-Key", _ocp_Apim_Subscription_Key);
        webRequest.KeepAlive = false;
        webRequest.ServicePoint.ConnectionLimit = 24;
        webRequest.UserAgent = "YourUserAgent";
        webRequest.ProtocolVersion = HttpVersion.Version10;
        webRequest.ContentType = "application/json";
        webRequest.Method = WebRequestMethods.Http.Post;
        webRequest.ContentLength = postData.Length;
        Stream dataStream = webRequest.GetRequestStream();
        dataStream.Write(postData, 0, postData.Length);
        dataStream.Close();
        using (WebResponse webResponse = webRequest.GetResponse())
        {
          contentType = webResponse.ContentType;
          using (Stream stream = webResponse.GetResponseStream())
          {
            using (MemoryStream memoryStream = new MemoryStream())
            {
              byte[] buffer = new byte[0x1000];
              int bytesRead;
              while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
              {
                memoryStream.Write(buffer, 0, bytesRead);
              }
              bytes = memoryStream.ToArray();
            }
          }
        }
        // For test/debug purposes (to see what is actually returned by the service)
        Console.WriteLine("Response data (relativePath: \"{0}\"):\n{1}\n\n", relativePath, Encoding.Default.GetString(bytes));
        return bytes;
      }
      catch (Exception ex)
      {
        throw new Exception("Failed to retrieve data from '" + fullyQualifiedUrl + "': " + ex.Message, ex);
      }
    }

    private static Uri GetFullyQualifiedURL(string domain, string relativePath)
    {
      if (!domain.EndsWith("/"))
        domain = domain + "/";
      if (relativePath.StartsWith("/"))
        relativePath = relativePath.Remove(0, 1);
      return new Uri(domain + relativePath);
    }
  }
}

The service is simple to use:

AzureApimService service = new AzureApimService("https://yourapim.azure-api.net", "12a6aca3c5a242f181f3dec39b264ab5");
string contentType;
byte[] response = service.Get("/api/endpoint", out contentType);
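A POST works the same way; the JSON body below is just an example:

```csharp
byte[] postData = Encoding.UTF8.GetBytes("{\"hello\":\"world\"}");
byte[] postResponse = service.Post("/api/endpoint", postData, out contentType);
```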


Webhook Event Receiver with Azure Functions


Microsoft Azure Functions is a solution for running small pieces of code in the cloud. If your code is very small and has only one purpose, an Azure Function could be a cost-effective solution.

This is an example of a generic Webhook event receiver. A webhook is a way for other systems to make a callback to your system whenever an event is raised in the other system. This Webhook event receiver will simply receive the Webhook event’s payload (payload = the JSON that the other system is POST’ing to you), envelope the payload and write it to a Queue.

STEP 1: SET UP AN AZURE FUNCTION

Select a Function App and create a new function:

Create New Azure Function

 

STEP 2: CREATE A NEW FUNCTION

Select New Function and, from the “API &amp; Webhooks” category, select “Generic Webhook – C#”:

Create Generic Webhook

Microsoft will now create a Webhook event receiver boilerplate code file, which we will modify slightly later.

STEP 3: ADD A ROUTE TEMPLATE

Because we would like to have more than one URL to our Azure Function (each webhook caller should have its own URL so we can differentiate between them), we need to add a route template.

Select the “Integrate” section and modify the “Route template”. Add {caller} to the field:

Add a Route Template

STEP 4: INTEGRATE WITH AZURE QUEUE STORAGE

We need to be able to write to an Azure Queue. In Azure Functions, the integration is almost out of the box.

Select the “Integrate” section and under “Outputs”, click “New Output”, and select the “Azure Queue Storage”:

Azure Queue Storage

Configure the Azure Queue Settings:

Azure Queue Settings

  • Message parameter name: The Azure Function knows about the queue through a parameter to the function. This is the name of the parameter.
  • Storage account connection: The connection string to the storage where the azure queue is located.
  • Queue name: The name of the queue. If the queue does not exist (it does not by default), it will be created for you.
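Behind the scenes, the “Integrate” settings are stored as a binding in the function’s function.json file. A sketch of what the queue output binding could look like (the queue name and connection setting name are examples, and the exact schema depends on your Functions runtime version):

```json
{
  "bindings": [
    {
      "type": "queue",
      "direction": "out",
      "name": "outQueue",
      "queueName": "myqueue",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```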

STEP 5: MODIFY THE BOILERPLATE CODE

We need to make small but simple modifications to the boilerplate code (I have marked the changes from the boilerplate code with comments):

#r "Newtonsoft.Json"

using System;
using System.Net;
using Newtonsoft.Json;

// The string caller was added to the function parameters to get the caller from the URL.
// The ICollector<string> outQueue was added to the function parameters to get access to the output queue.
public static async Task<object> Run(HttpRequestMessage req, string caller, ICollector<string> outQueue, TraceWriter log)
{
    log.Info($"Webhook was triggered!");

    // The JSON payload is found in the request
    string jsonContent = await req.Content.ReadAsStringAsync();
    dynamic data = JsonConvert.DeserializeObject(jsonContent);

    // Create a dynamic JSON output, enveloping the payload with
    // the caller, a timestamp, and the payload itself
    dynamic outData = new Newtonsoft.Json.Linq.JObject();
    outData.caller = caller;
    outData.timeStamp = System.DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss.fff");
    outData.payload = data;

    // Add the JSON as a string to the output queue
    outQueue.Add(JsonConvert.SerializeObject(outData));

    // Return status 200 OK to the calling system.
    return req.CreateResponse(HttpStatusCode.OK, new
    {
        caller = $"{caller}",
        status = "OK"
    });
}

STEP 6: TEST IT

Azure Functions has a built-in tester. Run a test to ensure that you have pasted the correct code and written the correct names in the “Integrate” fields:

Test

Use the Microsoft Azure Storage Explorer to check that the event was written to the queue:

Azure Storage Explorer

STEP 7: CREATE KEYS FOR THE WEBHOOK EVENT SENDERS

Azure Functions are not available unless you know the URL and the key. Select “Manage” and add a new Function Key.

Function Keys

The difference between Function Keys and Host Keys is that Function Keys are specific to that function, while Host Keys are global keys that can be used for any function.

To call your Azure Function, the caller needs to know the URL + the key. The key can be sent in more than one way:

  • In the URL, using the ?code=(key value) and &clientid=(key name) query string parameters.
  • In the request header, using the x-functions-key HTTP header.
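As a sketch, the two options could look like this from C# (the host name, route, key name and key value are all made up):

```csharp
using System;
using System.Net.Http;

public static class FunctionKeyExamples
{
  // Option 1: put the key in the query string (?code=...&clientid=...)
  public static string BuildUrlWithKey(string baseUrl, string keyName, string keyValue)
  {
    return $"{baseUrl}?code={Uri.EscapeDataString(keyValue)}&clientid={Uri.EscapeDataString(keyName)}";
  }

  // Option 2: put the key in the x-functions-key HTTP header
  public static HttpRequestMessage BuildRequestWithKeyHeader(string baseUrl, string keyValue)
  {
    var request = new HttpRequestMessage(HttpMethod.Post, baseUrl);
    request.Headers.Add("x-functions-key", keyValue);
    return request;
  }
}
```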

STEP 8: GIVE URL AND KEY TO CALLING SYSTEM

This is a Restlet Client example that calls my function. I use the QueryString to add the code and clientid parameters:



.NET Session state is not thread safe


When working with the .NET session state you should bear in mind that the HttpContext.Current.Session cannot be transferred to another thread. Imagine that you, from the Global.asax would like to read the SessionID each time a session is started:

// This method inside Global.asax is called for every session start
protected void Session_Start(object sender, EventArgs e)
{
  MyClass.DoSomethingWithTheSession(HttpContext.Current);
}

To speed up performance you wish to use a thread inside DoSomethingWithTheSession. The thread will read the Session ID:

public class MyClass
{
  public static void DoSomethingWithTheSession(HttpContext context)
  {
    if (context == null)
      return;

    // Here the context is not null
    ThreadPool.QueueUserWorkItem(DoSomethingWithTheSessionAsync, context);
  }

  private static void DoSomethingWithTheSessionAsync(object httpContext)
  {
    HttpContext context = (HttpContext)httpContext;

    // Oops! Here the context is NULL
    string sessionID = context.Session.SessionID;
  }
}

The code above will fail because the HttpContext is not thread safe. In DoSomethingWithTheSession(), the context is set, but in DoSomethingWithTheSessionAsync(), the context will be null.

THE SOLUTION: TRANSFER THE SESSION VALUES INSTEAD OF THE SESSION OBJECT:

To make it work, rewrite the DoSomethingWithTheSessionAsync() method to retrieve the values needed, not the HttpContext object itself:

public class MyClass
{
  public static void DoSomethingWithTheSession(HttpContext context)
  {
    if (context == null)
      return;

    // Transfer the sessionID instead of the HttpContext and everything is fine
    ThreadPool.QueueUserWorkItem(DoSomethingWithTheSessionAsync,
      context.Session.SessionID);
  }

  private static void DoSomethingWithTheSessionAsync(object session)
  {
    // This works fine, as the string is thread safe.
    string sessionID = (string)session;

    // Do work on the sessionID
  }
}
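If the worker needs more than one session value, the same pattern applies: copy the values into a simple immutable object before queueing the work. A minimal sketch (the SessionSnapshot and SessionWorker types are hypothetical):

```csharp
using System;
using System.Threading;

// Immutable carrier for the session values the worker needs
public sealed class SessionSnapshot
{
  public string SessionId { get; }
  public string UserName { get; }

  public SessionSnapshot(string sessionId, string userName)
  {
    SessionId = sessionId;
    UserName = userName;
  }
}

public static class SessionWorker
{
  // Queues background work that only sees plain, thread-safe values,
  // never the HttpContext itself
  public static void Process(SessionSnapshot snapshot, Action<string> log)
  {
    ThreadPool.QueueUserWorkItem(state =>
    {
      var s = (SessionSnapshot)state;
      log($"Working on session {s.SessionId} for {s.UserName}");
    }, snapshot);
  }
}
```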


C# Using Newtonsoft and dynamic ExpandoObject to convert one Json to another


The scenario where you convert one input Json format to another output Json format is not uncommon. Before C# dynamic and ExpandoObject, you would deserialize the input Json to POCO model classes and use a factory class to convert them to another set of POCO model classes, which would then be serialized to Json.

With the dynamic type and the ExpandoObject you have another weapon of choice, as you can deserialize the input Json to a dynamic object, and convert the contents to another dynamic object that is serialized. Imagine the following input and output Json formats:

Input format:

{
	"username": "someuser@somewhere.com",
	"timeStamp": "2017-09-20 13:50:16.560",
	"attributes": {
		"attribute": [{
			"name": "Brian",
			"count": 400
		},
		{
			"name": "Pedersen",
			"count": 100
		}]
	}
}

Output format:

{
	"table": "USER_COUNT",
	"users": [{
		"uid": "someuser@somewhere.com",
		"rows": [{
			"NAME": "Brian",
			"READ_COUNT": 400
		},
		{
			"NAME": "Pedersen",
			"READ_COUNT": 100
		}]
	}]
}

Converting from the input format to the output format can be achieved with a few lines of code:

// Convert input Json string (here called myQueueItem) to a dynamic object
dynamic input = JsonConvert.DeserializeObject(myQueueItem);

// Create a dynamic output object
dynamic output = new ExpandoObject();
output.table = "USER_COUNT";
output.users = new dynamic[1];
output.users[0] = new ExpandoObject();
output.users[0].uid = input.username;
output.users[0].rows = new dynamic[input.attributes.attribute.Count];
int ac = 0;
foreach (var inputAttribute in input.attributes.attribute)
{
    var row = output.users[0].rows[ac] = new ExpandoObject();
    row.NAME = inputAttribute.name;
    row.READ_COUNT = inputAttribute.count;
    ac++;
}

// Serialize the dynamic output object to a string
string outputJson = JsonConvert.SerializeObject(output);

I’ll try to further explain what happens. The Newtonsoft.Json DeserializeObject() method takes a json string and converts it to a dynamic object.

The output Json is created by creating a new dynamic object of the type ExpandoObject(). With dynamic ExpandoObjects we can create properties on the fly, like so:

// Create a dynamic output object
dynamic output = new ExpandoObject();
// Create a new property called "table" with the value "USER_COUNT"
output.table = "USER_COUNT";

This would, when serialized to a Json, create the following output:

{
"table": "USER_COUNT"
}

To create an array of objects, you need to first create a new dynamic array and then assign an ExpandoObject to the position in the array:

// Create a dynamic output object
dynamic output = new ExpandoObject();
// Create a new array called "users"
output.users = new dynamic[1];
// Add an object to the "users" array
output.users[0] = new ExpandoObject();
// Create a new property "uid" in the "users" array
output.users[0].uid = input.username;

This generates the following Json output:

{
	"users": [{
		"uid": "someuser@somewhere.com"
		}]
}


Using C# HttpClient from Sync and Async code


The .NET 4.5 C# System.Net.Http.HttpClient is a very nice HTTP client implementation, but it can be tricky to use if you (like me) are not a trained asynchronous programming coder. So here is a quick cheat sheet on how to work with Task&lt;&gt;, async and await when using the HttpClient.

UPDATE 2018-01-19: Removed the HttpClient from the using clause to avoid having to instantiate a new HttpClient for every request, as this can lead to socket exhaustion. For more information, read the Improper Instantiation antipattern article from Microsoft.

EXAMPLE 1: HTTPCLIENT GET WITH RETURN VALUE:

THE GET METHOD:

private static readonly HttpClient _httpClient = new HttpClient();

public static async Task<string> Get(string queryString)
{
  string authUserName = "user";
  string authPassword = "password";
  string url = "https://someurl.com";

  // If you do not have basic authentication, you may skip these lines
  var authToken = Encoding.ASCII.GetBytes($"{authUserName}:{authPassword}");
  _httpClient.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", Convert.ToBase64String(authToken));

  // The actual Get method
  using (var result = await _httpClient.GetAsync($"{url}{queryString}"))
  {
    string content = await result.Content.ReadAsStringAsync();
    return content;
  }
}

THE USAGE:

// From synchronous code
string queryString = "?hello=world";
string result = Get(queryString).Result;

// From asynchronous code
string queryString = "?hello=world";
string result = await Get(queryString);

 

EXAMPLE 2: HTTPCLIENT PUT “FIRE-AND-FORGET” WITHOUT RETURN VALUE:

THE PUT METHOD:

private static readonly HttpClient _httpClient = new HttpClient();

public static async Task Put(string postData)
{
  string authUserName = "user";
  string authPassword = "password";
  string url = "https://someurl.com";

  // If you have no basic authentication, you can skip these lines
  var authToken = Encoding.ASCII.GetBytes($"{authUserName}:{authPassword}");
  _httpClient.DefaultRequestHeaders.Authorization = new System.Net.Http.Headers.AuthenticationHeaderValue("Basic", Convert.ToBase64String(authToken));

  // The actual Put method including error handling
  using (var content = new StringContent(postData))
  {
    var result = await _httpClient.PutAsync($"{url}", content);
    if (result.StatusCode == HttpStatusCode.OK)
    {
      return;
    }
    else
    {
      // Do something with the contents, like write the status code and
      // contents to a log file
      string resultContent = await result.Content.ReadAsStringAsync();
      // ... write to log
    }
  }
}

THE USAGE:

// You can call the method from asynchronous
// and it will actually run asynchronous. In this fire-and-forget 
// pattern, there is no need to wait for the answer
Put("data");

// ... but if you will wait, simply call ".Wait()":
Put("data").Wait();



Sitecore publish:end and publish:end:remote


In Sitecore, you have several ways of executing code after a publish. The publish:end and publish:end:remote events are the 2 most obvious choices.

There is a little confusion as to when in the publish pipeline these events are fired. In previous versions (prior to Sitecore 7.2), publish:end and publish:end:remote were fired for each language and each target, and publish:complete and publish:complete:remote were fired when the publish job was done. But in later versions, publish:end and publish:end:remote are also only fired once, when the current publish operation is completed.

The :remote events (publish:end:remote and publish:complete:remote) are fired on your remote (content delivery, or CD) servers.

Please note that the EventArgs are not the same for publish:end and publish:end:remote. So to properly handle publish events you need 2 different functions in your code.

To handle the publish:end event, you will need the following function:

public void PublishEnd(object sender, EventArgs args)
{
  var sitecoreArgs = args as Sitecore.Events.SitecoreEventArgs;
  if (sitecoreArgs == null)
    return;

  var publisher = sitecoreArgs.Parameters[0] as Publisher;
  if (publisher == null)
    return;

  var rootItem = publisher.Options.RootItem;
  
  // Do code
}

To handle the publish:end:remote event, you need the following function:

public void PublishEndRemote(object sender, EventArgs args)
{
  var sitecoreArgs = args as Sitecore.Data.Events.PublishEndRemoteEventArgs;
  if (sitecoreArgs == null)
    return;

  Item rootItem = Factory.GetDatabase("web").GetItem(new ID(sitecoreArgs.RootItemId));
  
  // Do code
}

And in the configuration you need to point to the proper method:

<sitecore>
  <events>
    <event name="publish:end">
      <handler type="MyNamespace, MyDll" method="PublishEnd"/>
    </event>
    <event name="publish:end:remote">
      <handler type="MyNamespace, MyDll" method="PublishEndRemote"/>
    </event>
  </events>
</sitecore>

MORE TO READ:

Sitecore 9 Tracker.Current.Session.Identify is replaced with Tracker.Current.Session.IdentifyAs


In Sitecore 9, Sitecore have decided to change how you identify named users, i.e. how you match a Contact with a user that is logged into your website. The Tracker.Current.Session.Identify method is obsolete. It has been replaced with Tracker.Current.Session.IdentifyAs:

// Sitecore 8 identification method:
public static void IdentifyUser(string username)
{
  // Never identify an anonymous user
  if (username.ToLower() == "extranet\\anonymous")
	return;

  if (Tracker.Current != null && Tracker.Current.IsActive && Tracker.Current.Session != null)
  {
	Tracker.Current.Session.Identify(username);
  }
}

// Sitecore 9 identification method:
public static void IdentifyUser(string username)
{
  // Never identify an anonymous user
  if (username.ToLower() == "extranet\\anonymous")
	return;

  string identificationSource = "website";
  if (Tracker.Current != null && Tracker.Current.IsActive && Tracker.Current.Session != null)
  {
	Tracker.Current.Session.IdentifyAs(identificationSource, username);
  }
}

The IdentifyAs() takes 2 parameters:

  • Source: A string that identifies where this contact comes from (for instance, “twitter” or “website”)
  • Identifier: The identifier itself (username, email, customerID or any other string based on your implementation)
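As an example, the identification method could be called right after a successful login. This is a hedged sketch: the LoginService class and flow are hypothetical, but AuthenticationManager.Login is the standard Sitecore membership call:

```csharp
using Sitecore.Security.Authentication;

public class LoginService
{
  // Hypothetical login flow: authenticate first, then identify the
  // contact, using the username as the identifier (the source
  // "website" is set inside IdentifyUser)
  public bool Login(string userNameWithDomain, string password)
  {
    if (!AuthenticationManager.Login(userNameWithDomain, password))
      return false;

    // Calls the Sitecore 9 IdentifyUser method shown above
    IdentifyUser(userNameWithDomain);
    return true;
  }
}
```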

Thanks to Sitecore support for the clarification.

MORE TO READ:

 

In Sitecore 9, the ProxyDisabler have been retired completely


Sitecore have finally retired the ProxyDisabler in Sitecore 9. Proxy items were the early version of item cloning and were deprecated in Sitecore 6. And now the ProxyDisabler has been removed.

There is no replacement. All you need to do is remove the line from your code.

// Old Sitecore 5,6,7,8 code:
public void Put(Item source)
{
  using (new ProxyDisabler())
  {
    // Do stuff with your item
  }
}

// New Sitecore 9 code:
public void Put(Item source)
{
  // Do stuff with your item
}

MORE TO READ:

Sitecore 9 Configuration not available on Dependency Injection – LockRecursionException: Recursive upgradeable lock acquisitions not allowed in this mode


From Sitecore 8.2, Sitecore have implemented Dependency Injection for their own classes. Sitecore uses Microsoft’s Dependency Injection library.

Sitecore uses dependency injection to inject many things, including configurations. Therefore, you cannot access configuration until after your code has been injected.

Take the following example:

using Sitecore.Configuration;

namespace MyCode
{
  public class ServicesConfigurator : IServicesConfigurator
  {
    public void Configure(IServiceCollection serviceCollection)
    {
      // This line will fail:
      var configuration = Factory.GetConfiguration();
      serviceCollection.AddTransient<MyClass>();
    }
  }
}

This code will throw an error:

[LockRecursionException: Recursive upgradeable lock acquisitions not allowed in this mode.]
System.Threading.ReaderWriterLockSlim.TryEnterUpgradeableReadLockCore(TimeoutTracker timeout) +3839391
System.Threading.ReaderWriterLockSlim.TryEnterUpgradeableReadLock(TimeoutTracker timeout) +45
Sitecore.Threading.Locks.UpgradeableReadScope..ctor(ReaderWriterLockSlim mutex) +107
Sitecore.DependencyInjection.ServiceLocator.get_ServiceProvider() +85
Sitecore.Configuration.Factory.<.cctor>b__0() +9
System.Lazy`1.CreateValue() +709
System.Lazy`1.LazyInitValue() +191
Sitecore.Configuration.Factory.GetConfiguration() +44

The implication is that none of your injected constructors can contain references to:

  • Databases
  • Site information
  • Settings
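To illustrate the problem, here is a hedged sketch of a constructor that would trigger the exception if the class is resolved while the container is being built (the class name is hypothetical):

```csharp
using Sitecore.Configuration;
using Sitecore.Data;

public class MyFailingRepository
{
  private readonly Database _database;

  // Factory.GetDatabase reads Sitecore configuration. Called from a
  // constructor during container setup, it causes the same
  // LockRecursionException as Factory.GetConfiguration above.
  public MyFailingRepository()
  {
    _database = Factory.GetDatabase("web");
  }
}
```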

HOW TO WORK AROUND INJECTED CODE WITH CONSTRUCTORS:

Imagine you would like to inject a repository that takes the Sitecore Database as a constructor parameter, and that the database should be the database of the current context.

First you create an interface:

using Sitecore.Data;

public interface IDatabaseFactory
{
  Database Get();
}

Then you create a concrete implementation of the interface:

using Sitecore.Data;

public class ContextDatabaseFactory : IDatabaseFactory
{
  public Database Get()
  {
    return Sitecore.Context.Database;
  }
}

In the ServicesConfigurator you can now register the concrete implementation:

public class ServicesConfigurator : IServicesConfigurator
{
  public void Configure(IServiceCollection serviceCollection)
  {
    // The database factory to inject:
    serviceCollection.AddTransient<IDatabaseFactory, ContextDatabaseFactory>();
    // The class that needs the database in the constructor:
    serviceCollection.AddTransient<MyRepository>();
  }
}

And in MyRepository you reference the IDatabaseFactory in the constructor instead of the concrete Sitecore Database implementation:

public class MyRepository
{
  private readonly IDatabaseFactory _database;
  
  public MyRepository(IDatabaseFactory database)
  {
    _database = database;
  }
  
  public void DoTheActualCode()
  {
    _database.Get().GetItem("/sitecore/content/...");
  }
}
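
A side benefit of the IDatabaseFactory abstraction is that you can swap implementations. This hypothetical variant always returns the master database; because the lookup happens at call time rather than construction time, it does not hit the locking issue:

```csharp
using Sitecore.Configuration;
using Sitecore.Data;

// Hypothetical alternative, e.g. for scheduled jobs that should
// always work on the master database:
public class MasterDatabaseFactory : IDatabaseFactory
{
  public Database Get()
  {
    return Factory.GetDatabase("master");
  }
}
```

Register it in ServicesConfigurator instead of ContextDatabaseFactory where needed.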

Many thanks to my cool colleagues who helped forge this solution.

MORE TO READ:

 

Creating dynamic arrays and lists using Dynamic and ExpandoObject in C#


In this previous post, C# Using Newtonsoft and dynamic ExpandoObject to convert one Json to another, I described how you can use the dynamic keyword and the ExpandoObject class to quickly transform JSON without the need for any concrete implementations of either the source or destination JSON.

This is an example of a dynamic list where you do not know the number of objects in the output array:

dynamic output = new List<dynamic>();

dynamic row = new ExpandoObject();
row.NAME = "My name";
row.Age = "42";
output.Add(row);

USAGE IN REAL LIFE:

Imagine you need to convert the following JSON by taking only those rows where the age is 18 or above:

{
  "attributes": [
    { "name": "Arthur Dent", "age": 42 },
    { "name": "Ford Prefect", "age": 1088 },
    { "name": "Zaphod Beeblebrox", "age": 17 }
  ]
}

The code to transform the JSON would look something like this:

// Convert input JSON to a dynamic object
dynamic input = JsonConvert.DeserializeObject(myQueueItem);

// Create a list of dynamic object as output
dynamic output = new List<dynamic>();

foreach (var inputAttribute in input.attributes)
{
  // Note: dynamic JSON property access is case sensitive,
  // so use the lowercase "age" as it appears in the JSON
  if (inputAttribute.age >= 18)
  {
    // Create a new dynamic ExpandoObject
    dynamic row = new ExpandoObject();
    row.name = inputAttribute.name;
    row.age = inputAttribute.age;
    // Add the object to the dynamic output list
    output.Add(row);
  }
}

// Finally serialize the output array
string outputJson = JsonConvert.SerializeObject(output);

The output is this:

[
  { "name": "Arthur Dent", "age": 42 },
  { "name": "Ford Prefect", "age": 1088 }
]

MORE TO READ:

Sitecore and WebApi


So you have some legacy WebApi code that needs to run in your Sitecore solution? Or are you just a WebApi expert who needs to use your favorite tool in the toolbox? Fear not, WebApi will run fine in your Sitecore solution.

You don’t need to use the native Sitecore 8.2 support for WebApi, you can use your own routes as well, and implement your nasty controller selectors, formatters and message handlers.

The API routes can be registered as a processor in the /sitecore/pipelines/initialize pipeline:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <sitecore>
    <pipelines>
      <initialize>
        <processor type="MyProject.RegisterApiRoutes, MyDll" />
      </initialize>
    </pipelines>
  </sitecore>
</configuration>

Please be aware that Sitecore have taken the /api/sitecore and the /api/rest/ routes for its own code already, so use another route and you will avoid clashes with the Sitecore API.

This is my sample route registration, using the /myapi/ route instead of /api/:

using System.Web.Http;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;
using Sitecore.Pipelines;

namespace MyProject
{
  public class RegisterApiRoutes
  {
    public void Process(PipelineArgs args)
    {
      HttpConfiguration config = GlobalConfiguration.Configuration;

      SetRoutes(config);
      SetSerializerSettings(config);
    }

    private void SetRoutes(HttpConfiguration config)
    {
      config.Routes.MapHttpRoute("Features", "myapi/features", new { action = "Get", controller = "Feature" });
      config.Routes.MapHttpRoute("Default route", "myapi/{controller}", new { action = "Get" });
    }

    private void SetSerializerSettings(HttpConfiguration config)
    {
      JsonSerializerSettings settings = new JsonSerializerSettings { ContractResolver = new DefaultContractResolver() };
      config.Formatters.JsonFormatter.SerializerSettings = settings;
      config.Formatters.Remove(config.Formatters.XmlFormatter);
      config.EnsureInitialized();
    }
  }
}

And I can implement my “Features” controller:

using System.Collections.Generic;
using System.Web.Http;

namespace Myproject
{
  public class FeatureController : ApiController
  {
    public dynamic Get()
    {
      return "hello world";
    }
  }
}

MORE TO READ:

 

C# Mask email address for GDPR reasons


UPDATE 2018-08-10: See this post SHA256 hashing email addresses for GDPR reasons for an even better masking approach. Thanks to Inspector Cluedget for the tip.

This is a C# extension method that will mask your email address following this pattern:

  • If it’s not an email, the entire string will be masked (“this string” => “***********”)
  • If the first part of the email is shorter than 4 characters, the entire email will be masked (me@somewhere.com => *@*.*)
  • All other emails are masked leaving only the first and last characters of the name and domain (somebody@somewhere.com => s******y@s*******e.com)

THE EXTENSION METHOD:

using System;
using System.Text.RegularExpressions;

namespace MyNamespace
{
  public static class EmailMasker
  {
    private static string _PATTERN = @"(?<=[\w]{1})[\w-\._\+%\\]*(?=[\w]{1}@)|(?<=@[\w]{1})[\w-_\+%]*(?=\.)";

    public static string MaskEmail(this string s)
    {
      if (!s.Contains("@"))
        return new String('*', s.Length);
      if (s.Split('@')[0].Length < 4) 
        return @"*@*.*"; 
      return Regex.Replace(s, _PATTERN, m => new string('*', m.Length));
    }
  }
}

USAGE:

using MyNamespace;

public void TestMethod()
{
  string email = "someperson@somedomain.com";
  string maskedEmail = email.MaskEmail();
  // result: s********n@s********n.com
}

WHY?

With the new GDPR rules you must be very careful when storing emails or other personal information anywhere, including your log files. And you should never give out a log file containing email addresses to a third party, even when this third party is “just helping you with a totally unrelated code bug elsewhere”.

There are many approaches to ensure GDPR compliance. The best way is to remove any personal data from any log file. This is not always possible, feasible or practical, which is why pseudonymization or data masking approaches will come in handy.

MORE TO READ:

 


SHA256 hashing email addresses for GDPR reasons


This is a followup on the previous post C# Mask email address for GDPR reasons, where user Inspector Cluedget pointed out that masking (replacing characters with *) an email address in the log file is the least safest of the data masking approaches available.

This extension method will SHA256 hash the email address and add a fake domain name (to make the string look like an email address).

THE EXTENSION METHOD:

using System.Security.Cryptography;
using System.Text;

namespace MyNamespace
{
  public static class StringFormatter
  {
    public static string MaskEmail(this string s)
    {
      return SHA256(s) + "@domain.com";
    }

    private static string SHA256(string s)
    {
      SHA256Managed sha256 = new SHA256Managed();
      StringBuilder hash = new StringBuilder();
      byte[] hashArray = sha256.ComputeHash(Encoding.UTF8.GetBytes(s));
      foreach (byte b in hashArray)
      {
        // "x2" pads each byte to two hex digits,
        // giving the full 64-character hash
        hash.Append(b.ToString("x2"));
      }
      return hash.ToString();
    }
  }
}

USAGE:

using MyNamespace;
 
public void TestMethod()
{
  string email = "someperson@somedomain.com";
  string maskedEmail = email.MaskEmail();
  // result: <sha256 hex hash>@domain.com
}

WHY?

With the new GDPR rules you must be very careful when storing emails or other personal information anywhere, including your log files. And you should never give out a log file containing email addresses to a third party, even when this third party is “just helping you with a totally unrelated code bug elsewhere”.

There are many approaches to ensure GDPR compliance. The best way is to remove any personal data from any log file. This is not always possible, feasible or practical, which is why pseudonymization or data masking approaches will come in handy.

MORE TO READ:

Sitecore from Rendering to Experience Editor


In Sitecore, how do you set up a template and a rendering that works in the Experience Editor? Here is the checklist:

STEP 1: CREATE A TEMPLATE

Create the template. For each field, use the “Title” field for all of your languages to display a user friendly name.

Template

STEP 2: CREATE CUSTOM EXPERIENCE BUTTONS

In the CORE database, add a Field Editor Button to the /sitecore/content/Applications/WebEdit/Custom Experience Buttons folder.

Add the fields that are not editable directly from the Experience Editor (or add all fields from your template, it is OK that a field can be edited directly and from a custom button). Use the “Fields” field to add a pipe separated list of field names:

Custom Experience Buttons

STEP 3: CREATE AN 128×128 PIXEL IMAGE OF YOUR CONTROL

Sitecore have a “take screenshot” feature, but a stylized image of your control is often better. Create a 128×128 pixel image (.png or .jpg) and upload it to the media library. The image must depict or represent the component you are about to create.

STEP 4: CREATE A RENDERING OR SUBLAYOUT

MVC projects create renderings; older ASPX projects create sublayouts.

Mark the rendering/sublayout as “Editable”.
Point the “Datasource Location” to where the datasource item should be placed. Use query: to select a dynamic location.
Select the template you created in STEP 1 in the “Datasource Template” field.

Sublayout

Select the custom experience button you created in STEP 2 in the “Experience Editor Buttons” field.

Sublayout

The field “Thumbnail” (found when selecting “Standard Fields”, under “Appearance”) is used for the 128×128 pixel image you created earlier.

Sublayout

Finally, use “Display Name” to give the rendering/sublayout a user friendly name. Remember to set a display name for all languages.

Sublayout

STEP 5: SETUP PLACEHOLDER SETTINGS

Add the rendering/sublayout from STEP 4 to the placeholder settings of the placeholders where you are allowed to add this component.

Placeholder Settings

THE END RESULT:

The Placeholder Settings allows you to see the rendering/sublayout. The Thumbnail gives your control a nice presentation, and the Display Name of the rendering gives your control a user friendly name:

Select a Rendering

The rendering/sublayout Datasource Location groups the places where the datasource can be created. The Datasource Template locks the sublayout/rendering to one specific template, making it impossible for the editor to select the wrong template.

Select the Associated Content

The Custom Experience Editor Button creates easy access to editing the content of the rendering/sublayout. And the Title field of the template fields displays a user friendly name.

Editing fields

MORE TO READ:

Sitecore find Unused Sublayouts


Long lived Sitecore solutions tend to build up unused renderings and sublayouts as design, features and functions evolve. Finding those unused sublayouts is not just a matter of checking the Sitecore Link Database for sublayouts with no references.

The problem seems easy at first: 1) Find all sublayouts. 2) Find all pages using sublayouts. 3) Those sublayouts not used on any pages are unused. And that’s what this code will check.

STEP 1: FIND ALL SUBLAYOUTS:

This repository returns every sublayout in your solution from a specified path:

using System.Collections.Generic;
using Sitecore.Data;
using Sitecore.Data.Items;

namespace MyCode
{
  public class SublayoutRepository
  {
    private readonly Database _database;
    private readonly List<SublayoutItem> _sublayouts = new List<SublayoutItem>();

    public SublayoutRepository(Database database)
    {
      _database = database;
    }

    public IEnumerable<SublayoutItem> Get(string rootPath)
    {
      _sublayouts.Clear();
      Item rootItem = _database.GetItem(rootPath);
      if (rootItem == null)
        return _sublayouts;
      Iterate(rootItem);
      return _sublayouts;
    }

    private void Iterate(Item item)
    {
      if (item.TemplateID == Sitecore.TemplateIDs.Sublayout)
        _sublayouts.Add(item);
      if (!item.HasChildren)
        return;
      foreach (Item child in item.Children)
      {
        Iterate(child);
      }
    }
  }
}

STEP 2: ITERATE ALL PAGES IN THE SOLUTION:

Finding used sublayouts is a little more tricky. Each item has a Visualization property and you need to iterate this to get the sublayouts from a specific device. In this example I have hard coded the device to “default“.

using System;
using System.Collections.Generic;
using System.Linq;
using Sitecore.Configuration;
using Sitecore.Data;
using Sitecore.Data.Items;

namespace MyCode
{
  public class FindUnusedSublayouts
  {
    private IEnumerable<SublayoutItem> _sublayouts;
    private List<SublayoutItem> _usedSublayouts = new List<SublayoutItem>();

    protected void Find()
    {
      _sublayouts = new SublayoutRepository(WebDatabase).Get("/sitecore/layout");
      _usedSublayouts.Clear();
      Item root = WebDatabase.GetItem("/sitecore/content");
      Iterate(root);
      var unusedSublayouts = _sublayouts.Where(s => !_usedSublayouts.Any(s2 => s2.ID == s.ID));
      foreach (var unusedSublayout in unusedSublayouts)
      {
        // Write the unused sublayout:
        // unusedSublayout.ID => The GUID
        // unusedSublayout.InnerItem.Paths.FullPath => The sublayout path
        // unusedSublayout.FilePath => The path to the .ascx file
      }
    }

    private void Iterate(Item root)
    {
      if (root.Visualization.Layout != null)
      { 
        foreach (var reference in root.Visualization.GetRenderings(DefaultDevice, false))
        {
          if (reference.RenderingItem != null)
          {
            try
            {
              SublayoutItem item = reference.RenderingItem.InnerItem;
              if (_usedSublayouts.All(s => s.ID != item.ID))
                _usedSublayouts.Add(reference.RenderingItem.InnerItem);
            }
            catch (Exception ex)
            {
              // Handle the exception, but do not 
              // stop the execution
            }
          }  
        }
      }
      if (!root.HasChildren)
        return;
      foreach (Item child in root.Children)
        Iterate(child);
    }

    private Database WebDatabase
    {
      get
      {
        return Factory.GetDatabase("web");
      }
    }

    private DeviceItem _device;

    private DeviceItem DefaultDevice
    {
      get
      {
        return _device ?? (_device = WebDatabase.Resources.Devices.GetAll().First(d => d.Name.ToLower() == "default"));
      }
    }

  }
}

MORE TO READ:

END NOTE:

I know that WebForms, .aspx and .ascx are dying technologies, and I encourage everyone to switch to Sitecore MVC. But a lot of older Sitecore solutions cannot switch to MVC for many reasons. Therefore it is important to help those who are maintaining large, long running Sitecore solutions.

Sitecore 9 Dependency Injection – Extend the Sitecore Logging


With the extended use of Dependency Injection (DI) in Sitecore 9, you have yet another tool to extend the Sitecore functionality. With DI you can basically replace or extend standard Sitecore functionality with your own code.

Dependency Injection is not a replacement for Sitecore Pipelines, but a supplement.

Behold the following case: Replacing the Sitecore.Diagnostics.Log with your own implementation. In this case I will replace the Sitecore.Diagnostics.DefaultLog with my own implementation.

STEP 1: REGISTER THE NEW LOG CLASS

The Sitecore.Diagnostics.DefaultLog inherits from Sitecore.Abstractions.BaseLog, so my ServicesConfigurator needs to register a new class for that:

using Microsoft.Extensions.DependencyInjection;
using Sitecore.DependencyInjection;

namespace MyCode
{
  public class ServicesConfigurator : IServicesConfigurator
  {
    public void Configure(IServiceCollection serviceCollection)
    {
      serviceCollection.AddTransient(typeof(Sitecore.Abstractions.BaseLog), typeof(MyCode.MyLog));
    }
  }
}

The ServicesConfigurator must be registered in Sitecore using a .config file. Make sure your config file adds your section later than Sitecore’s own configuration: he who registers last has the DI registration:

<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:env="http://www.sitecore.net/xmlconfig/env/">
  <sitecore>
    <services>
      <configurator type="MyCode.ServicesConfigurator, MyDll" />
    </services>
  </sitecore>
</configuration>

STEP 2: IMPLEMENT THE NEW LOG CLASS

That’s basically it.

Now I can replace Sitecore functionality by implementing a class that inherits from Sitecore.Abstractions.BaseLog:

using System;
using Sitecore.Caching;

namespace MyCode
{
  public class MyLog : Sitecore.Abstractions.BaseLog
  {
    public override void Audit(string message, string loggerName)
    {
      // Do something else
    }
	
    // ... override the remaining BaseLog members
  }
}

Or I can extend Sitecore functionality by inheriting from Sitecore.Diagnostics.DefaultLog:

using System;
using Sitecore.Diagnostics;

namespace MyCode
{
  public class MyLog: Sitecore.Diagnostics.DefaultLog
  {
    public override void Audit(string message, object owner)
    {
      // Adding the calling type to the message, thus extending
      // the existing log messages
      base.Audit(owner.GetType() + " " + message, owner);
    }

    // ... override other members as needed
  }
}

MORE TO READ:

Solr delete document using UI and querystring


How do you delete a document from Solr?

You can use query string parameters to do the delete:

https://[server]:8983/solr/[core]/update?commit=true&stream.body=<delete><query>[query]</query></delete>

  • server: The name of your Solr server
  • core: The name of the Solr core (the Solr index) to delete from
  • query: The Solr query selecting the documents to delete

Example:

https://localhost:8983/solr/sitecore_master_index/update?commit=true&stream.body=<delete><query>_group:b74d1721779538ddb695afe69ce5f461</query></delete>
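
If you would rather not place the XML in the query string (where it needs to be URL encoded), the same delete-by-query can be sent as a POST body. This is a hedged C# sketch using HttpClient, with the same server, core and query as the example above:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static async Task DeleteByQueryAsync()
{
  using (var client = new HttpClient())
  {
    var body = new StringContent(
      "<delete><query>_group:b74d1721779538ddb695afe69ce5f461</query></delete>",
      Encoding.UTF8, "text/xml");

    // commit=true makes the delete visible immediately
    var response = await client.PostAsync(
      "https://localhost:8983/solr/sitecore_master_index/update?commit=true",
      body);
    response.EnsureSuccessStatusCode();
  }
}
```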

MORE TO READ:
