Wednesday, November 28, 2018

File Uploads in Dynamics 365 For Operations

I was recently reviewing some file import methods and came across a new (to me) way of uploading files from the client.  It turns out to be quite simple.

This opens a file selection dialog:
FileUploadTemporaryStorageResult uploadResult = Dynamics.AX.Application.File::GetFileFromUser();

This returns a System.IO.Stream:
System.IO.Stream stream = uploadResult.openResult();
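
For a quick illustration, here is a minimal sketch of my own (not part of the import class below) that assumes the user picks a plain-text file and simply echoes its contents:

FileUploadTemporaryStorageResult uploadResult = Dynamics.AX.Application.File::GetFileFromUser();

//getUploadStatus() is false if the user cancelled the dialog
if (uploadResult && uploadResult.getUploadStatus())
{
    //the using block disposes the reader (and the underlying stream) when done
    using (System.IO.StreamReader reader = new System.IO.StreamReader(uploadResult.openResult()))
    {
        info(reader.ReadToEnd());
    }
}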

These snippets were part of a class that imported a two-column Excel spreadsheet.  Please note that X++ now supports using statements. Here is the full code for reference:

using System.IO;
using OfficeOpenXml;
using OfficeOpenXml.Style;
using OfficeOpenXml.Table;

public class PayrollCodeImport
{
    public static void main(Args _args)
    {
        System.String value = null;
        OfficeOpenXml.ExcelRange cell = null;
        PayrollCode payrollCode;
        //upload
        FileUploadTemporaryStorageResult uploadResult = Dynamics.AX.Application.File::GetFileFromUser();
        PayrollCodeId payrollCodeId;
        Description  description;

        //getUploadStatus() is false if the user cancelled the dialog
        if (!uploadResult || !uploadResult.getUploadStatus())
        {
            return;
        }
        //construct a new spreadsheet document with the upload stream
        OfficeOpenXml.ExcelPackage package = new ExcelPackage(uploadResult.openResult());
        //OXMLSpreadsheetDocument_RU spreadsheet = OXMLSpreadsheetDocument_RU::constructFromStream(uploadResult.openResult());
        //get the worksheets from the workbook
        OfficeOpenXml.ExcelWorksheets worksheets = package.get_Workbook().get_Worksheets();
        //get the worksheet by name
        OfficeOpenXml.ExcelWorksheet payrollcodeWorksheet = worksheets.get_Item("Import") as OfficeOpenXml.ExcelWorksheet;
        //skip the header line
        for (int i = payrollcodeWorksheet.Dimension.Start.Row + 1; i <= payrollcodeWorksheet.Dimension.End.Row; i++)
        {
            //column 1: payroll code (Convert handles numeric cells that a direct cast would fail on)
            cell = payrollcodeWorksheet.Cells.get_Item(i, 1);
            value = System.Convert::ToString(cell.Value);
            payrollCodeId = value;

            //column 2: description
            cell = payrollcodeWorksheet.Cells.get_Item(i, 2);
            value = System.Convert::ToString(cell.Value);
            description = value;

            if(payrollCodeId && description)
            {
                payrollCode.Clear();

                ttsbegin;
                //select inside the transaction so the update lock is valid
                select forupdate payrollCode where payrollCode.PayrollCodeId == payrollCodeId;
                if(!payrollCode.RecId)
                {
                    payrollCode.PayrollCodeId = payrollCodeId;
                    payrollCode.Description = description;
                    payrollCode.insert();
                }

                ttscommit;
            }
        }
    }

}

Friday, August 24, 2018

Dynamics 365 FO Connection to ServiceBus

I recently received a request to connect D365 FO to an Azure Service Bus in order to download messages that were being sent by another application.  During development, I found that there are two client libraries for this: Microsoft.Azure.ServiceBus and Microsoft.ServiceBus.  The first, however, is tied to .Net 4.6.1, and D365 FO is tied to .Net 4.5.2, so using the Azure library was out of the question.  Unfortunately, there was no documentation on how to use the object model in the plain Microsoft.ServiceBus library.  Fortunately, both libraries take a similar approach with a similar object model, so I was able to adapt.  There were 2 caveats:


  1. D365 does not appear to work well with .Net async approaches.
  2. D365 does not appear to work well with generics.
To get around these, I used a synchronous approach and passed anything that required generics to a C# class that could do the processing. The Service Bus dll can be downloaded from https://www.nuget.org/packages/WindowsAzure.ServiceBus/4.1.11

Once downloaded, open the package as a zip file (you may need to change the extension to .zip) and extract the dll, then add it as a reference in D365.  From there, the following class can be used to connect and download messages; sending messages uses similar logic (see the sketch after the class).  Please note that my requirements were to use Topics and Subscriptions rather than Queues.  Also note that I keep all of the connection parameters in a separate table, so the system can access different Topic/Subscription combinations as required, and any change to the connection information is just a data change.

public class ServiceBusConnect
{
    public void new()
    {  
    }

    public void receiveMessages(str _topic, str _subscription)
    {

        str strBody;
        Microsoft.ServiceBus.Messaging.SubscriptionClient subscriptionClient;
        Microsoft.ServiceBus.Messaging.MessagingFactory messagingFactory;
        Microsoft.ServiceBus.Messaging.BrokeredMessage message;
        ServiceBusConnectionParameters parameters;
       
        select firstonly parameters where parameters.Topic == _topic && parameters.Subscription == _subscription;
        if (!parameters.RecId)
        {
            //nothing to do without connection parameters
            return;
        }
        ServiceBusConnector.BrokeredMessageProcessor connector = new ServiceBusConnector.BrokeredMessageProcessor();
        messagingFactory = Microsoft.ServiceBus.Messaging.MessagingFactory::Create(parameters.ConnectionString,
                Microsoft.ServiceBus.TokenProvider::CreateSharedAccessSignatureTokenProvider(parameters.SharedAccessKeyName, parameters.SharedAccessKey));

        subscriptionClient = messagingFactory.CreateSubscriptionClient(parameters.Topic, parameters.Subscription);
        //Receive returns null if no message arrives within the 5 second timeout
        message = subscriptionClient.Receive(System.TimeSpan::FromSeconds(5));
        if (message != null)
        {
            setPrefix("Messages");
        }

        while(message != null)
        {
            strBody = connector.GetBrokeredMessageContents(message);
            info(strBody);
            subscriptionClient.Complete(message.LockToken);
            message = subscriptionClient.Receive(System.TimeSpan::FromSeconds(5));
        }
        subscriptionClient.Close();
    }

}
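
Sending is the mirror image of receiving. Here is a rough sketch of my own, not production code, assuming the same parameters table; TopicClient is the send-side counterpart of SubscriptionClient:

public void sendMessage(str _topic, str _messageBody)
{
    Microsoft.ServiceBus.Messaging.MessagingFactory messagingFactory;
    Microsoft.ServiceBus.Messaging.TopicClient topicClient;
    Microsoft.ServiceBus.Messaging.BrokeredMessage message;
    ServiceBusConnectionParameters parameters;

    select firstonly parameters where parameters.Topic == _topic;

    messagingFactory = Microsoft.ServiceBus.Messaging.MessagingFactory::Create(parameters.ConnectionString,
            Microsoft.ServiceBus.TokenProvider::CreateSharedAccessSignatureTokenProvider(parameters.SharedAccessKeyName, parameters.SharedAccessKey));

    topicClient = messagingFactory.CreateTopicClient(parameters.Topic);
    //BrokeredMessage(object) serializes the body with a DataContractSerializer;
    //make sure it matches whatever the receiving side expects from GetBody
    message = new Microsoft.ServiceBus.Messaging.BrokeredMessage(_messageBody);
    topicClient.Send(message);
    topicClient.Close();
}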

The table ServiceBusConnectionParameters contains all the data for connecting to Service Bus.  The Topic and Subscription should be based on how your project's service is configured.  The connection string will be sb://<your project>.servicebus.windows.net.  The SharedAccessKeyName will be whatever is configured in your service bus; the default is RootManageSharedAccessKey.  The shared access key is the primary key as displayed in Service Bus Explorer.

As usual when using .Net assemblies in D365, all class references must be fully qualified with their namespaces.  The parameter in the subscriptionClient.Receive call is a timeout: if no message is found within 5 seconds, the call returns and the message is set to null.  Without it, the system just hangs indefinitely waiting for messages.  GetBrokeredMessageContents calls the GetBody method on the message itself.  That method requires a generic type parameter for the message body, and I could not find a way to supply one from within D365.  The wrapper to deserialize the message was pretty straightforward:

using System.IO;
using Microsoft.ServiceBus.Messaging;


namespace ServiceBusConnector
{
    public class BrokeredMessageProcessor
    {
        public string GetBrokeredMessageContents(BrokeredMessage message)
        {
            string strResult;

            using (var stream = message.GetBody<Stream>())
            {
                using (var reader = new StreamReader(stream))
                {
                    strResult = reader.ReadToEnd();
                }
            }

            return strResult;
        }
    }
}



Tuesday, June 5, 2018

Ax 2009 - Payment Journal Report Slow

We have a client on Ax 2009 who reported that the Payment Journal report was slow (not completing).  One person in my group had seen the issue before.  Her solution was:

  1. Stop AOS
  2. Delete CustVendPaymReport from SSRS
  3. Restart SSRS service
  4. Back up and delete the XPPIL file contents
  5. Start AOS
  6. Redeploy custVendPaymReport
  7. Run Full CIL
  8. Run DB sync
I'm not sure why this works.  My colleague thinks it's because the report has an uncommitted transaction.  If that were the case, though, restarting SQL Server would have solved it, and it would not explain why redeploying the report is part of the process.

I have 2 suspicions, but can't prove either.  One is that the queries involved in gathering the data are missing a needed index.  We tried running the Ola Hallengren maintenance scripts to update statistics and reorganize/rebuild indexes; that did not help.  The other possibility is that one of the db temp tables somehow survives from run to run when it should not.

Dynamics 365 SSL Certificates Expire (HTTP 500)


I had a major issue today with my Impark local development environment.  Whenever I tried to start an instance, I kept getting an HTTP 500 (Internal Server Error).  Searching for HTTP 500 with D365 got me nowhere.  When I finally looked in the Event Viewer, I found a number of errors related to expired certificates.



Exception information:

    Exception type: ExpiredCertificateException

    Exception message: Expired certificate for id '61EF4024783735137DBA51E341E36A0AE6A89497'.



Checking that, I found that the SSL certificates used by D365 had expired; the image my installation came from was about a year old.  Checking further, I found a way to renew the certificates.  Search for and open Manage Computer Certificates.

Select Personal -> Certificates.  Look for the DeploymentOneBox entries.

For each, double-click and go to Details -> Thumbprint.

Copy and paste each thumbprint into Notepad, one per line.  When finished, remove all the spaces from each thumbprint.  Once you have all 4 thumbprints, you can create a PowerShell script.  At the top of the file, put:

Set-Location -Path "cert:\LocalMachine\My"



Then, for each thumbprint, create the following 2 lines:

$OldCert = (Get-ChildItem -Path 2e10fea6fcd1a23d0ba5dfefe5ce2eb3d8915f1a)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)



Where the long hex string is the thumbprint of the certificate to clone.



The full script should look like:



Set-Location -Path "cert:\LocalMachine\My"



$OldCert = (Get-ChildItem -Path 2e10fea6fcd1a23d0ba5dfefe5ce2eb3d8915f1a)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)



$OldCert = (Get-ChildItem -Path 61ef4024783735137dba51e341e36a0ae6a89497)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)



$OldCert = (Get-ChildItem -Path 3591d2eb68e980666190fe2b4c7f2abf14e55871)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)



$OldCert = (Get-ChildItem -Path cc5ba410b7fbb04249020b181fdad0fcf999698a)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)





You can run the thumbprints one at a time or all together.  After each pair, you will get some output.  Copy and paste the new thumbprint under the entry for the old one.  Your file should now look like:



Set-Location -Path "cert:\LocalMachine\My"



$OldCert = (Get-ChildItem -Path 2e10fea6fcd1a23d0ba5dfefe5ce2eb3d8915f1a)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)

B977B19D03EDB88B4D5CD647DFA839D3E4AEBA23



$OldCert = (Get-ChildItem -Path 61ef4024783735137dba51e341e36a0ae6a89497)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)

47CF0D5220C9457ABEFB53AF026600ED24A00584



$OldCert = (Get-ChildItem -Path 3591d2eb68e980666190fe2b4c7f2abf14e55871)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)

1F710D2AE32F0202499C586F3A1B8B3A6FE1EC7A



$OldCert = (Get-ChildItem -Path cc5ba410b7fbb04249020b181fdad0fcf999698a)

New-SelfSignedCertificate -CloneCert $OldCert -NotAfter (Get-Date).AddMonths(999)

9256AC02204C006D47AC10D1DD51DC27DF3B2A6F



Where the line under each pair is the new thumbprint.  Now in Visual Studio, open <D365 Install Drive>:\AOSService\webroot\web.config, wif.config, and wif.services.config.  Once open, do a global replace in all three files of each old thumbprint with its new one.  Save the files with a .config.new extension.  Then open File Explorer, navigate to the webroot directory, and rename the existing config files with an extension of .old or .bak.  Finally, remove the .new extensions from the 3 new files (this can't be done in Visual Studio owing to permission issues).  Once complete, reboot and you should be working again.

Tuesday, May 1, 2018

Screen Resolution Issues

Recently, I was working for a new client.  When I logged into Ax, the font sizes seemed uncomfortably small.  The major problem, however, was that an SSRS report I was working on had major presentation issues (fields cut off, sections not appearing, etc.).  When I exported the report to PDF, though, the presentation was fine.  So I decided to see if adjusting the resolution Ax was running at would help.  It turned out it did.  Here is the link:

https://stoneridgesoftware.com/dynamics-ax-2012-high-resolution-screens-bad-relationship/

Wednesday, April 25, 2018

Financial Dimensions....Pesky Financial Dimensions

Finally...how to retrieve, in code, a dimension value from a default dimension given the dimension name:

public str getDimensionValueByName(DimensionDefault _dimension, str _dimensionName)
{
    DimensionAttributeValueSetStorage dimStorage;
    Counter i;
    str result;
   
    dimStorage = DimensionAttributeValueSetStorage::find(_dimension);
    for(i = 1; i <= dimStorage.elements(); i++)
    {
        if(DimensionAttribute::find(dimStorage.getAttributeByIndex(i)).Name == _dimensionName)
        {
            result = dimStorage.getDisplayValueByIndex(i);
            break;
        }
    }
   
    return result;
}

How to get or create a default dimension (here the dimension names and values are passed in as parallel containers):

public DimensionDefault getOrSetDefaultDimension(container _conAttr, container _conValue)
{
    DimensionAttributeValueSetStorage   valueSetStorage = new DimensionAttributeValueSetStorage();
    DimensionDefault result;

    int i;
    DimensionAttribute dimensionAttribute;
    DimensionAttributeValue dimensionAttributeValue;
    str dimValue;

    // _conAttr holds the dimension attribute names, _conValue the matching display values
    for (i = 1; i <= conLen(_conAttr); i++)
    {
        dimensionAttribute = DimensionAttribute::findByName(conPeek(_conAttr, i));

        if (dimensionAttribute.RecId == 0)
        {
            continue;
        }

        dimValue = conPeek(_conValue, i);

        if (dimValue != "")
        {
            // The last parameter is "true": a dimensionAttributeValue record will be created if not found.
            dimensionAttributeValue =
                    DimensionAttributeValue::findByDimensionAttributeAndValue(dimensionAttribute, dimValue, false, true);

            // Add the dimensionAttributeValue to the default dimension
            valueSetStorage.addItem(dimensionAttributeValue);
        }
    }

    result = valueSetStorage.save();
    return result;
}
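
If both methods live on the same class, a hypothetical caller might look like this (the dimension names and values are invented for illustration):

//build (or find) a default dimension from parallel name/value containers
DimensionDefault defaultDimension = this.getOrSetDefaultDimension(
        ["BusinessUnit", "CostCenter", "Department"],
        ["005", "0120", "022"]);

//read a single value back out by name; returns an empty string if the dimension is absent
info(this.getDimensionValueByName(defaultDimension, "CostCenter"));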

Thursday, January 4, 2018

D365 - Security Fix Crashes SQL Server 2016

Happy New Year.

Yesterday, I got a call from the project team.  Apparently, they had found a bug that needed an emergency fix, on a project that I hadn't looked at in over a month.  Starting my VM, I found that my SQL Server instance was crashing.  The message I kept getting was:

FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn\mssqlsystemresource.mdf for file number 1.  OS error: 2(The system cannot find the file specified.).

My initial check suggested a security issue: the logon user for my SQL instance needed to be granted access to the relevant directories.  However, my user was Network Service, so it should have had full permissions for everything on the VM.

Further checking indicated that a security patch from MS was to blame.  Apparently, the files (both mdf and ldf) had been moved to a different location: C:\Program Files\Microsoft SQL Server\130\LocalDB\Binn.  Copying and pasting the files back to the correct location solved the problem.

http://strictlysql.blogspot.ca/2016/11/sql-server-2016-sql-service-doesnt.html