Creating flexible RESTful Services for Dynamics Part 3

This is a continuation of my previous post: [http://community.dynamics.com/ax/b/dynamicsax_wpfandnetinnovations/archive/2014/03/08/creating-flexible-restful-services-for-dynamics.aspx], where I introduced a flexible REST service that works with non-Active Directory sources.
 
I’ve used .NET reflection to satisfy the flexibility and extensibility requirements of my generic service. Reflection gives the programmer the ability to load assemblies and create instances of types at runtime (as opposed to at application start-up). More information on reflection can be found here: [http://msdn.microsoft.com/en-us/library/f7ykdhsy(v=vs.110).aspx]. When you use this principle in combination with “open-script” (non-precompiled) ASP.NET websites, something quite interesting becomes available to you… the ability to simply "drop in", "replace" or "remove" class library files on the fly, while the application is actually LIVE.
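As a minimal sketch of what such a "drop-in" might look like (the class and method names below are hypothetical, not part of the actual service), a file like this could be placed in the website’s "App_Code" folder while the site is live; IIS recompiles the site and the type immediately becomes available to the reflection calls shown later in this article.

Example: EchoServiceClass (hypothetical drop-in)
using System;

public class EchoServiceClass
{
    // Any method invoked by the broker takes the raw request data as a string and
    // returns a string result (matching the object[] { data } pattern used further down).
    public string Echo(string data)
    {
        return "Echoed at " + DateTime.Now + ": " + data;
    }
}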
 

·         This is only available in “open-script”, non-compiled ASP.NET websites.

 
When this happens, the IIS engine automatically spots the change to the website and invokes a dynamic compile. Given the lightweight nature of this type of application, this typically takes only a few seconds (on a fast machine). New class libraries that are "dropped into" the solution are available immediately, or if a class library is “hot-swapped”, the new business logic is likewise available immediately. As you can imagine, this massively boosts the flexibility of the application.
 

·         The class libraries must be compile-tested before a hot-swap is attempted.

·         “Open-script” websites should only be used for internal data exchanges, to mitigate the risk of unauthorised sources injecting their own assemblies into your solution.

 
The primary exposed methods of the service are listed below (a sketch of possible declarations follows the list):
 

1.)   ProcessImmediate: attempts to process a data request sent on the current URL querystring.

2.)   ProcessBufferedRequest: attempts to process an accumulated data request that has been sent on consecutive URL querystrings since the last reset or successful processing.
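The declarations below are a sketch only, assuming the same [WebMethod]/[ScriptMethod] pattern as the AddDataToChannel method shown further down; the article does not show the actual signatures, so the class name and parameter names are my assumptions.

Example (assumed signatures): GenericBroker service
using System.Web.Services;
using System.Web.Script.Services;

public class GenericBroker : WebService
{
    [WebMethod(EnableSession = true)]
    [ScriptMethod(UseHttpGet = true)]
    public string ProcessImmediate(string serviceClass, string invocationMethod, string data)
    {
        // Processes the data supplied on the current querystring only.
        return ProcessRequest(serviceClass, invocationMethod, data);
    }

    [WebMethod(EnableSession = true)]
    [ScriptMethod(UseHttpGet = true)]
    public string ProcessBufferedRequest(string serviceClass, string invocationMethod)
    {
        // Processes whatever AddDataToChannel has accumulated in Session["data"].
        string buffered = (string)(Session["data"] ?? "");
        Session["data"] = "";   // reset the channel once the buffered data has been handed off
        return ProcessRequest(serviceClass, invocationMethod, buffered);
    }

    private string ProcessRequest(string serviceClass, string invocationMethod, string data)
    {
        // Reflection-based instantiation and invocation - see the code later in the article.
        return "<brokerResponse></brokerResponse>";
    }
}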

 
The buffered request uses the standard ASP.NET “Session” object to concatenate sequenced URL querystrings:
 
Method: AddDataToChannel
[WebMethod(EnableSession = true)]
[ScriptMethod(UseHttpGet = true)]
public string AddDataToChannel(string data)
{
    string result, status, message;
    result = ""; status = "OK"; message = "Data appended to channel";
    result += "<brokerResponse>";
 
    try
    {
        Session["data"] += data;
    }
    catch (Exception ex)
    {
        status = "Error";
        message = ex.Message;
        goto FinishAddDataToChannel;
    }
 
FinishAddDataToChannel:
    result += "<brokerStatus>" + status + "</brokerStatus>";
    result += "<brokerMessage>" + message + "</brokerMessage>";
    result += "</brokerResponse>";
 
    return result;
}
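From the Requester’s side, the consecutive GET requests must share the same ASP.NET session cookie, otherwise each call lands in a fresh Session and nothing accumulates. Below is a minimal client sketch; the host name, service path and the serviceClass/invocationMethod parameter names are assumptions for illustration only.

Example (hypothetical Requester): using the buffered channel
using System;
using System.Net;
using System.Net.Http;

class BufferedRequester
{
    static void Main()
    {
        // Hypothetical broker address - adjust to your own deployment.
        const string broker = "http://myserver/GenericBroker/Broker.asmx";

        // A shared CookieContainer keeps the ASP.NET session cookie across calls,
        // so every AddDataToChannel request appends to the same Session["data"] buffer.
        var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
        using (var client = new HttpClient(handler))
        {
            // 1) Send the payload in chunks, URL-encoding each one to stay within querystring limits.
            string[] chunks = { "info(\"Hello from the Generic Broker\"); ", "return \"done\";" };
            foreach (string chunk in chunks)
            {
                string url = broker + "/AddDataToChannel?data=" + Uri.EscapeDataString(chunk);
                Console.WriteLine(client.GetStringAsync(url).Result);   // broker's XML response
            }

            // 2) Ask the broker to process everything accumulated so far.
            string response = client.GetStringAsync(broker +
                "/ProcessBufferedRequest?serviceClass=DynamicsClass&invocationMethod=ExecuteScript").Result;
            Console.WriteLine(response);
        }
    }
}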
 
 
The AddDataToChannel method is necessary to overcome the inherent restriction on the amount of data that can be sent via a URL querystring, which is currently around 4,000 characters for Internet Explorer, while some other browsers impose no fixed limit. Conversely, there is no size limit on the response, but the settings in the web configuration section must be changed to suit your requirements. Because we will be talking to MS Dynamics, the receive timeout has been set to 10 minutes and the maximum response size to 2 GB… but you should never operate a data-exchange system at such thresholds.
 
Configuration: web.config
<netTcpBinding>
  <binding name="NetTcpBinding_LNPS_DynamicScriptClass" closeTimeout="00:01:00" openTimeout="00:01:00"
           receiveTimeout="00:10:00" sendTimeout="00:01:00" transactionFlow="false" transferMode="Buffered"
           transactionProtocol="OleTransactions" hostNameComparisonMode="StrongWildcard" listenBacklog="10"
           maxBufferPoolSize="2147483647" maxBufferSize="2147483647" maxConnections="10"
           maxReceivedMessageSize="2147483647">
    <readerQuotas maxDepth="32" maxStringContentLength="2147483647" maxArrayLength="16384"
                  maxBytesPerRead="4096" maxNameTableCharCount="16384"/>
    <reliableSession ordered="true" inactivityTimeout="00:10:00" enabled="false"/>
    <security mode="Transport">
      <transport clientCredentialType="Windows" protectionLevel="EncryptAndSign"/>
      <message clientCredentialType="Windows"/>
    </security>
  </binding>
</netTcpBinding>
 
 
The ASP.NET web service also has to be configured to accept requests over plain HTTP GET and POST (rather than SOAP only), so that requests can be sent via the URL querystring. This is done via a simple change to the [web.config] file.
 
Configuration: web.config
<webServices>
  <protocols>
    <add name="HttpGet"/>
    <add name="HttpPost"/>
  </protocols>
</webServices>
 
 
The real magic happens in the next two sections of code.
 
First we attempt to instantiate an object based on the string variable “serviceClassValue” (this is passed in via the URL request). At this point, the application has no idea whether this class actually exists…
 
// attempt to create requested service object
try
{
    classType = TypeDelegator.GetType(serviceClassValue);
    serviceClass = Activator.CreateInstance(classType);
}
catch (Exception ex)
{
    status = "Error";
    message = "Unable to create class [" + serviceClassValue + "] - " + ex.Message;
    goto FinishProcessRequest;
}
 
 
The Requester (i.e. the system making the request) can send anything in this string variable, and it’s up to the .NET Framework to determine whether that class actually exists within the current assembly manifest. If it doesn't, the Requester gets an error back. Otherwise, an instance of the class is created in memory and an attempt is made to execute it with the parameter data supplied in the current URL (or in previous URL requests):
 
// attempt to invoke requested method
try
{
    paramValue = new object[1];
    paramValue[0] = data;
    Session["status"] = "OK";
    Session["message"] = "";
    invocationMethodResult = classType.InvokeMember(invocationMethodValue, BindingFlags.InvokeMethod, null, serviceClass, paramValue);
    status = (string)Session["status"];
    if (status == "OK")
    {
        message += (message == "" ? "" : " - ") + "Invocation method executed OK";
    }
    else
    {
        message = (string)Session["message"];
        goto FinishProcessRequest;
    }
}
catch (Exception ex)
{
    status = "Error";
    message = "Unable to invoke [" + invocationMethodValue + "] - " + ex.Message;
    goto FinishProcessRequest;
}
 
 
At this point, the application has instantiated a class and invoked a method that it theoretically had no knowledge of. On top of that, it didn’t even concern itself with vitally important things like interface method signatures or the data mappings required for those parameters. This is like programming with all the safeties off!
 

·         Invocations of conventional .NET methods will fail if the correct number of parameters (with the correct data types) is not supplied (see the sketch below).
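A standalone sketch of that behaviour (not part of the broker code): Type.InvokeMember throws a MissingMethodException when no overload matches the supplied arguments.

Example: signature mismatch with InvokeMember
using System;
using System.Reflection;

class SignatureDemo
{
    public string Echo(string data) { return data; }

    static void Main()
    {
        Type t = typeof(SignatureDemo);
        object instance = Activator.CreateInstance(t);

        // Correct call: a single string argument matches Echo(string).
        object ok = t.InvokeMember("Echo", BindingFlags.InvokeMethod, null, instance,
                                   new object[] { "hello" });
        Console.WriteLine(ok);

        try
        {
            // Wrong call: two arguments - no matching overload exists,
            // so the runtime throws a MissingMethodException.
            t.InvokeMember("Echo", BindingFlags.InvokeMethod, null, instance,
                           new object[] { "hello", 42 });
        }
        catch (MissingMethodException ex)
        {
            Console.WriteLine("Invocation failed: " + ex.Message);
        }
    }
}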

 
The invoked class method could be a new assembly that has just been dropped into the application manifest, or one that has been hot-swapped on the fly… the application simply doesn’t care. In that respect it acts as a true “Generic Broker”, simply ferrying data packets between systems. This makes it an incredibly powerful and flexible web service, but it also opens the programming model up to all sorts of potential abuse. Therefore, use of such a service needs to be regulated appropriately and it should be implemented only where there is a requirement for a high degree of flexibility. The pros and cons are summarised below:
 
Pros:
The “Generic Broker” is middleware that doesn’t really check (or care) about interface requirements on either side of the request-response loop (“Requester” > “Target” > “Requester”). It is agnostic to the data received or sent; it simply attempts execution of dynamically invoked classes/methods and reports success or failure as appropriate. It is resilient to system change (on either side) and therefore fundamentally upgrade-proof. Additional functionality can be added to it (or changed) on the fly without the need for recompilation or system downtime.
Cons:
It is now the responsibility of the Requester system to supply correct information, such as class and method names and the appropriate data for the Target system. It is also the responsibility of the Target system to sanitise the request (if required), as this is now theoretically an open channel. No data checking is done by the “Generic Broker”. The only things the middleware will do are restrict who can use the service, either via Active Directory or IP address filtering (see the previous article), and audit the communication that takes place.
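As a reminder of the kind of gatekeeping meant here (covered in more detail in the previous article), a very simple IP allow-list check with basic auditing might look like the sketch below; the addresses and the trace-based audit are illustrative assumptions, the article itself stores the audit in a separate database.

Example (hypothetical): IP allow-list check
using System;
using System.Diagnostics;
using System.Linq;
using System.Web;

public static class BrokerGatekeeper
{
    // Hypothetical allow-list of Requester systems permitted to use the broker.
    private static readonly string[] AllowedAddresses = { "192.168.10.21", "192.168.10.22" };

    public static bool IsCallerAllowed()
    {
        string caller = HttpContext.Current.Request.UserHostAddress;
        bool allowed = AllowedAddresses.Contains(caller);

        // Audit every attempt (a trace here; a production broker would write to its audit database).
        Trace.WriteLine(DateTime.Now + " broker call from " + caller + (allowed ? " accepted" : " rejected"));

        return allowed;
    }
}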
 
Now that we’ve discussed Reflection concepts, the final thing to cover is communication with the AOS, which must still be done via WCF. The hot-swappable class that invokes MS Dynamics functionality (i.e. “WCF > AIF”) is a separate class that gets dropped into the “App_Code” folder. Because we want the “openness” of an unrestricted channel, the application has been hooked up to the Dynamic X++ script service that I created in this article: [http://community.dynamics.com/ax/b/dynamicsax_wpfandnetinnovations/archive/2013/09/14/possibly-the-last-service-you-ll-need.aspx]
 
Class: DynamicsClass
public string ExecuteScript(string data)
{
    // Aif Service Client
    LNPS_DynamicScriptClassClient client = new LNPS_DynamicScriptClassClient();

    // Create an instance of the CallContext class.
    CallContext context = new CallContext();
 
    // attempt remote execution
    try
    {
        data = "str dynamicScript() { \n" + data + " \n}";
        string result = client.runScript(context, data);
        try
        {
            if (result.Substring(0, 5) == "Error")
            {
                HttpContext.Current.Session["status"] = "Error";
                HttpContext.Current.Session["message"] = result;
            }
        }
        catch { }
        return result;
    }
    catch (Exception ex)
    {
        return (ex.Message + ": " + ex.StackTrace);
    }
}
 
 
The “data” passed into the application (in this case pass-through X++) is wrapped up in a function call and then sent to the AOS for dynamic compilation and execution. The result (success or not) is then passed back to the Requester to be dealt with accordingly.
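To make the wrapping concrete (the X++ payload below is just a hypothetical example), a handful of raw statements sent by the Requester ends up as a compilable X++ function before it reaches the AOS:

Example: wrapping the X++ payload
using System;

class WrapDemo
{
    static void Main()
    {
        // Hypothetical payload: raw X++ statements supplied by the Requester.
        string data = "info(\"Hello from the Generic Broker\");\nreturn \"done\";";

        // The same wrap performed by ExecuteScript above, producing a compilable X++ function.
        string wrapped = "str dynamicScript() { \n" + data + " \n}";
        Console.WriteLine(wrapped);
        // Output:
        //   str dynamicScript() {
        //   info("Hello from the Generic Broker");
        //   return "done";
        //   }
    }
}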
 
In summary:

1.)   The Requester sends the Generic Broker an HTTP request (or many requests) with URL parameters. The data from all these requests is accumulated in a Session object.

2.)   Part of the URL parameters tells the Generic Broker which class to create and which method to invoke; another part contains the actual X++ script that will be executed on the AOS (an illustrative request is sketched after this summary).

3.)   The AOS receives the script and attempts dynamic compilation and execution. If successful, the results are passed back through the channel.

4.)   The Generic Broker fields the response, sends it back to the Requester and audits the communication in a separate database.
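Putting the pieces together for the single-request path (again, the host and querystring parameter names are my assumptions), a ProcessImmediate call might be built like this on the Requester side:

Example (hypothetical Requester): single ProcessImmediate call
using System;
using System.Net;

class ImmediateRequester
{
    static void Main()
    {
        // A short X++ payload that fits comfortably within the querystring limit.
        string xpp = "info(\"Immediate call\"); return \"done\";";

        string url = "http://myserver/GenericBroker/Broker.asmx/ProcessImmediate"
                   + "?serviceClass=DynamicsClass"
                   + "&invocationMethod=ExecuteScript"
                   + "&data=" + Uri.EscapeDataString(xpp);

        using (var client = new WebClient())
        {
            // One round trip: class creation, method invocation and AOS execution in a single request.
            Console.WriteLine(client.DownloadString(url));
        }
    }
}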

 
In the next article I’ll illustrate the uses and flexibility of this type of synchronous data communication, with a specific focus on rapid application development.
 
REGARDS
 
 
