Tuesday, December 12, 2017

IIS Compatibility Issues

While I was innocently working along on a small ASP.Net project (my first in a couple of years), a client noticed that a text area was displaying too thinly (about 50 pixels wide) when it was supposed to cover the full width of a tab page.  After checking around, I found that it was probably due to the compatibility mode setting in her IE11.  Setting a meta tag for IE9 compatibility solved the issue.
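
For reference, this is the standard X-UA-Compatible tag for forcing IE9 rendering (not necessarily the exact markup I ended up with); it goes near the top of the page's <head>:

<meta http-equiv="X-UA-Compatible" content="IE=9" />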

https://serverfault.com/questions/142721/iis-displaying-page-differently-when-localhost-is-used-in-url-vs-hostname
https://msdn.microsoft.com/en-us/library/cc288325(VS.85).aspx
https://msdn.microsoft.com/en-us/library/jj676915(v=vs.85).aspx

Monday, November 27, 2017

Allowing IIS Application Pools to Access Local Folders

I was recently working with a client's ASP.Net application and had a requirement to log various items to a file.  One of the issues was how to grant read/write permissions on the folder in question to the correct user.  This article explains how to grant folder permissions to the user running a given application pool.

https://docs.microsoft.com/en-us/iis/manage/configuring-security/application-pool-identities

Essentially, it involves granting the appropriate permission to "IIS AppPool\DefaultAppPool" (or whatever the application pool is named in IIS).
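
As a sketch, the same grant can also be scripted with icacls; the folder path below is just a placeholder, and the identity should be whichever application pool the site actually runs under:

icacls "C:\inetpub\MyApp\Logs" /grant "IIS AppPool\DefaultAppPool":(OI)(CI)M

(OI)(CI) makes the grant inherit to files and subfolders, and M grants modify rights, which covers creating and appending to log files.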

Friday, November 17, 2017

D365 - Windows Server 2016 Datacenter Re-Activation

The Windows evaluation on my VM for client development expired, and the machine started shutting down every half hour.  Needless to say, this is quite annoying.

After some googling, I found that the evaluation edition of Windows can be re-armed.  To check the current licensing status, run this at a command prompt:

slmgr.vbs -dli

To actually extend the evaluation, run:
slmgr.vbs -rearm
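
If you want to see how many rearms are left before using one, the detailed view also reports the remaining rearm count:
slmgr.vbs -dlv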

https://organicax.com/2014/10/03/windows-rearm
https://support.microsoft.com/en-us/help/948472/how-to-extend-the-windows-server-2008-evaluation-period

The articles say that the extension is 60 days; however, my VM says 180.

Thursday, October 26, 2017

Decrypt Web.Config for D365

If you need to get into the D365 database on a D365 VM, the credentials are located in the web.config under <install drive>:\AOSService\webroot.  However, they are encrypted.  To decrypt them, you just need to run the following in a console window:

<install drive>:\AOSService\webroot\bin\Microsoft.Dynamics.AX.Framework.ConfigEncryptor.exe -decrypt <install drive>:\AOSService\webroot\web.config

Just make sure that you open the console as an administrator.  Normal users do not have access to the RSA keys.

Monday, October 2, 2017

Note to self...

Do not define string EDTs with a length of 4000 characters.  Ax uses Unicode, so 2000 characters is already 4000 bytes, and a 4000-character column will cause inserts to generate an "Invalid Precision Value" error in SQL Server.  Use a memo type instead.

https://community.dynamics.com/ax/f/33/t/177918


Tuesday, September 19, 2017

Ax 2012 - Queries and OData

Yesterday, I had an assignment to export a data table to Excel.  The current recommendations were to use either a list page or an SSRS report.  I had heard a number of references to OData, but did not know much about it.  It turns out that it was really easy to set up a feed.  Along the way, I learned something nice about AOT Query objects.

After a little reading, I found that any AOT Query object can be exposed as an OData source.  My only issue was that the data that I needed came from a complex collection of tables (read: unrolling dimensions again).  As a result, I decided to create an in-memory table and populate it using a stored procedure in the database.  That part was simple enough (just remember that the method making the database call must be marked as server).
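
A minimal sketch of that populate pattern, written against the MyInMemoryTempTable::populate() method that the init() override below calls; the stored procedure name and the two fields are placeholders, not the real project objects:

// Placeholder stored procedure and fields -- substitute the real ones
public static server MyInMemoryTempTable populate()
{
    MyInMemoryTempTable tmpTable;
    Connection          connection;
    Statement           statement;
    ResultSet           resultSet;

    connection = new Connection();
    statement  = connection.createStatement();

    // direct SQL requires an asserted SqlStatementExecutePermission
    new SqlStatementExecutePermission('EXEC usp_GetSalesSummary').assert();
    resultSet = statement.executeQuery('EXEC usp_GetSalesSummary');

    while (resultSet.next())
    {
        tmpTable.AccountNum = resultSet.getString(1);
        tmpTable.Amount     = resultSet.getReal(2);
        tmpTable.insert();      // rows accumulate in the buffer's in-memory data set
    }

    CodeAccessPermission::revertAssert();

    return tmpTable;            // caller hands this buffer to setCursor()
}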

In general, I prefer to work with in-memory temp tables.  This is because tempDB is one of the busiest places in SQL Server.  From what I've gathered, every query that is executed against more than one table is resolved there.  So the general advice is: if you don't have to use tempDB, don't.

The only issue that I had was how to populate a query whose data source is an empty temp table.  It turns out that AOT Query objects are based on QueryRun.  Literally, the class declaration reads:

public class QueryRun extends ObjectRun

So, to populate the query, all you need to do is call setCursor() in the init method.

public void init()
{
    super();
    // attach the populated temp buffer as the query's cursor
    this.setCursor(MyInMemoryTempTable::populate());
}


From there, exposing the Query was a simple matter of going to Organization Administration -> Setup -> Document Management and opening Document data sources.  Once there, simply add a new document data source.  The module is arbitrary and is just used for organizing the list of data sources.  Data source type needs to be Query Reference.  Then select the name of the Query object that you created and check Activated.  Description is optional.



Links: 
https://blogs.msdn.microsoft.com/aif/2011/08/23/odata-query-service/
http://immerhier.com/connect-microsoft-power-bi-to-ax-2012-odata-query-service/
http://www.uxceclipse.com/odata-powerquery-and-microsoft-dynamics-ax-2012-data-sources/


Friday, September 1, 2017

Dynamics 365 - Extending Sales and Purchase Documents

I almost got it right on my first attempt.  I was recently asked to update the PO Report in Dynamics 365.  In the extension model, a report is updated by doing the following:

  1. Create table extensions if extra data is required.
  2. Create a copy of the report you wish to update.
  3. Create a derived class of the controller that handles the report you are updating.
  4. Change the main method to point to the new report (see the sketch after this list).
  5. Create a derived class of the data provider and override the processReport method to gather any new data required (don't forget to call super() to perform the original queries).
  6. Repoint the report's data sources to the derived data provider.
  7. Reformat the report as required.
  8. Extend all menu items that point to the original controller and repoint them to the new controller (object property).
  9. Compile the project, deploy the report and perform a synchronization.
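
As a rough sketch of steps 3 to 5, assuming the copied report is the CPPurchPurchaseOrder object referenced later in this post (the class names here are illustrative, not the actual project code):

class CPPurchPurchaseOrderController extends PurchPurchaseOrderController
{
    public static void main(Args _args)
    {
        CPPurchPurchaseOrderController controller = new CPPurchPurchaseOrderController();

        // step 4: point the controller at the copied report design
        controller.parmReportName(ssrsReportStr(CPPurchPurchaseOrder, Report));
        controller.parmArgs(_args);
        controller.startOperation();
    }
}

class CPPurchPurchaseOrderDP extends PurchPurchaseOrderDP
{
    // step 5: let the standard processing run, then gather the extra data
    public void processReport()
    {
        super();
        // populate extension fields / additional temp tables here
    }
}

The menu item extensions from step 8 then just need their Object property changed to the new controller class.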

In most cases, this is all that would be required to update an existing report.  However, the Purchasing and Sales documents maintain a reference to the original document under print management.  So, even if you run the correct controller and data provider and you are pointing to the correct report, Dynamics 365 will return a blank report in the original format.

In order to define the correct default, you need to update the print management setting for the document type.  The process is the same for both Sales and Purchasing.


Write a delegate that adds your extended report to the list of available reports for the document type.
public class CPPrintMgtDocTypeHandlersExt
{
    [SubscribesTo(classstr(PrintMgmtDocType), delegatestr(PrintMgmtDocType, getDefaultReportFormatDelegate))]
    public static void getDefaultReportFormatDelegate(PrintMgmtDocumentType _docType, EventHandlerResult _result)
    {
        switch (_docType)
        {
            case PrintMgmtDocumentType::PurchaseOrderRequisition:
                _result.result(ssrsReportStr(CPPurchPurchaseOrder, Report));
                break;
        }
    }
}

In this case, an extended report is added to the Purchase Orders list.

Then set the new report as the default for both the original document and the copy.  For Purchase Orders, go to Accounts Payable -> Setup -> Forms -> Form Setup.





Then click Print management setup.  Once inside, select your document, right-click and select New from the menu.  Fill in the other details and open the Report format drop-down.  If the delegate was set up correctly, the new report format should appear.  Repeat these steps to add a default format for the copy as well.





Wednesday, August 30, 2017

Manually Add the Descriptor

Another task on this project had me update a label in the middle of a very large method that is called from 49 other places in D365.  Given the choice between changing 50 places in code via a set of derived classes and extensions or just customizing the sys layer, I opted for the latter.  So, I created a customization project in a new model under ApplicationSuite.  The change took a few minutes: I had to create new label files, write the labels and change the line of code.  I checked everything into VSTS correctly.  Then a colleague did a test build and we found that the new model I created did not get registered by D365.  As a result, it could not be built from the Dynamics 365 menu.

My first pass at fixing this was to export the project and email him the file.  My colleague then imported it, and the model was correctly created.  However, this did not seem like the right solution.

Contacting another colleague led me to the right solution.  Apparently, each model has a summary XML file in a Descriptor directory.  Customization models will have the file in the Descriptor folder of the model being customized (e.g. ApplicationSuite).  Extension models will have a Descriptor folder under their own root (e.g. <AosService>\PackagesLocalDirectory\<Model Name>\Descriptor).  In either case, the descriptor file needs to be manually added to VSTS.

Once a build machine gets the latest, the new model should be added.


Tuesday, August 29, 2017

Dynamics 365 -- Custom Lookups

I started working on Dynamics 365 over the last few weeks.  It's been a bit of a rocky ride working with the new extension model.  I recently had a requirement to add the ProjId to the PurchReqTable form and limit the options to projects that have been assigned to the current user.

In order to do this, I created an in-memory lookup table and populated it with the correct data.  However, when I wrote the lookup code, I kept getting the error "More than one form was opened at once for the lookup control."  I looked and looked and could not find a resolution.  I had thought it was because ProjId has an EDT relation to ProjTable and also has a predefined lookup form.  That would have meant undoing all of this; furthermore, the customized lookup would then have had to be used on every form that uses a ProjId.  This was not a good approach.  By sheer luck, I finally found a post at https://ievgensaxblog.wordpress.com/2016/05/16/ax-7-how-to-override-form-control-methods-using-extensions/ that explained how to suppress the system-generated lookup.

My version of the code looked like:

[FormControlEventHandler(formControlStr(PurchReqTable, PurchReqLine_ProjId), FormControlEventType::Lookup)]
public static void PurchReqLine_ProjId_OnLookup(FormControl sender, FormControlEventArgs e)
{
    CPLookupTmp lookup;
    SysTableLookup sysTableLookup;
    FormControlCancelableSuperEventArgs csea = e as FormControlCancelableSuperEventArgs;

    lookup = CPLookupTmp::populatePropertiesForUser();
    lookup.setTmpData(lookup);
    sysTableLookup = SysTableLookup::newParameters(tableNum(CPLookupTmp), sender);
    sysTableLookup.addLookupfield(fieldNum(CPLookupTmp, CPId));
    sysTableLookup.setLabel('@CP:PropertyId');
    sysTableLookup.addLookupfield(fieldNum(CPLookupTmp, CPValue));
    sysTableLookup.setLabel('@CP:PropertyName');
    sysTableLookup.parmTmpBuffer(lookup);
    sysTableLookup.performFormLookup();
    csea.CancelSuperCall();
}

The important point here is the class FormControlCancelableSuperEventArgs.  Its CancelSuperCall method is what stops the base lookup form from being called.

Tuesday, July 4, 2017

Referencing .Net Assemblies from Ax

Much Easier


I've now moved to an environment that uses Dynamics Ax 2012 R3.  Today I was asked how to reference a .Net Assembly from Ax code.  It took a little trial and error, but it was very easy.


First I created a C# Class Library project in Visual Studio.  The project contained one class and two methods.





using System;
namespace AxTestClassProject
{
    public class AxTestClass
    {
        public static string HelloWorld()
        {
            return "Hello World";
        }
        public string TestInstanceMethod()
        {
            return "Test Successful";
        }
    }
}




I then compiled the project and added it to the AOT.  Next, I had to manually copy and paste the DLL from the bin\debug directory to the bin directory of the client.


Next, inside the AOT, I added a reference to the new DLL.





Finally, I created a job that referenced the two methods.  I experimented with code access permissions but found they were not needed here, so I left that code commented out; it is needed if dynamic typing is used.



static void TestAxTestClassProject(Args _args)
{
    //InteropPermission perm;
    System.String clrStr;
    AxTestClassProject.AxTestClass testClass = new AxTestClassProject.AxTestClass();
    str axStr;
    /*perm = new InteropPermission(InteropKind::ClrInterop);
    if (perm == null)
    {
        return;
    }
    perm.assert();*/
    clrStr = AxTestClassProject.AxTestClass::HelloWorld();
    axStr = clrStr;
    info(axStr);
   
    clrStr = testClass.TestInstanceMethod();
    axStr = clrStr;
    info(axStr);
    //CodeAccessPermission::revertAssert();
}





Running the job produced the expected results.





Comments
  1. Notice that all references to .Net classes must be fully qualified.  There are no using or imports statements in 2012 R3.
  2. I decided to be cautious and marshal the .Net strings directly into x++ strings.  In this case, it does not appear to be needed.  However, if any manipulation is required between .Net primitive types and x++ primitive types, these direct assignments are required.
  3. For 2012 R3, the .Net project must target .Net Framework 4.0 (i.e. not the Client Profile).
  4. You should be able to reference any FCL classes or other assemblies in the GAC without additional references.



Tuesday, May 9, 2017

Ax Configuration Files and SSRS

We had a new issue here.  One of our pre-production AOS instances needed to be rebuilt.  After our first deployment to the new environment, we found that SSRS reports were generating errors.  We were getting a pattern of two warnings, an error and two more warnings.


1) This warning was received twice:
Unable to get version information using existing WCF configuration from local storage.

Exception details:
Microsoft.Dynamics.AX.Framework.Services.Client.MetadataServiceException: Exception occurred on the metadata service on client or server. See exception details below:

2) The error was:
AXRDCE The AXRDCE extension caught an unexpected exception for report MWW_PYGByLocation.pdPYGByLocation.
The error message was:
Exception occurred on the metadata service on client or server. See exception details below:
>Unable to log on to Microsoft Dynamics AX.
   at Microsoft.Dynamics.AX.Framework.Services.Client.ServiceClientHelper.InvokeChannelOperation[TResult,TChannel](IServiceClient`1 client, Func`2 operationInvoker, Func`2 exceptionWrapper)

Then the warnings above repeated themselves twice.

Also, when I tried running any report from the SSRS server, I kept getting an error about not being able to set or retrieve the default value for parameter "CompanyName".

After working through the normal list of solutions (restarting the AOS, cleaning the caches, doing a full CIL compile and refreshing the client-side services), I decided to dig into the issue around the company name.  This finally led me to the following TechNet article: https://technet.microsoft.com/en-us/library/hh389774.aspx.

It turns out that the Business Connector information and authentication methods are not stored in the SSRS reports as I had previously thought.  Rather, they are held in a designated configuration file (it must be named Microsoft.Dynamics.AX.ReportConfiguration.axc) that must be placed in the <SSRS server path>\Reporting Services\ReportServer\bin directory.

After restarting SSRS, everything was working as normal.

Wednesday, March 8, 2017

Finally...A discussion of how to unroll dimension hierarchies

A quick note today.  I finally found a good discussion of how to unroll dimension hierarchies (such as default dimensions on Sales Orders/Lines or Purchase Orders/Lines)

http://axmasterclass.com/blog/financial-dimensions-deep-dive/

Working through the notes, I finally developed this query (first pass, and it will probably need improvement):

SELECT CT.ACCOUNTNUM [Acct #], DTC.NAME [Account Name],
DAVSI.DISPLAYVALUE [Location Num], DT.NAME [Location Name],
SUM(CIT.LINEAMOUNT) [Total Sales]
FROM DIMENSIONATTRIBUTEVALUESETITEM DAVSI
INNER JOIN DIMENSIONATTRIBUTEVALUE DAV ON DAVSI.DIMENSIONATTRIBUTEVALUE=DAV.RECID
     AND DAVSI.PARTITION = DAV.PARTITION
INNER JOIN DIMENSIONATTRIBUTE DA ON DAV.DIMENSIONATTRIBUTE=DA.RECID
     AND DAV.PARTITION = DA.PARTITION
INNER JOIN DIRPARTYTABLE DT ON DT.OMOPERATINGUNITNUMBER = DAVSI.DISPLAYVALUE
     AND DT.OMOPERATINGUNITTYPE=6
     AND DT.INSTANCERELATIONTYPE = 2377
INNER JOIN CUSTINVOICEJOUR CIJ ON CIJ.DEFAULTDIMENSION = DAVSI.DIMENSIONATTRIBUTEVALUESET
     AND CIJ.PARTITION = DAVSI.PARTITION
INNER JOIN CUSTINVOICETRANS CIT ON CIT.INVOICEID = CIJ.INVOICEID
     AND CIT.INVOICEDATE = CIJ.INVOICEDATE
     AND CIT.SALESID = CIJ.SALESID
     AND CIT.NUMBERSEQUENCEGROUP = CIJ.NUMBERSEQUENCEGROUP
INNER JOIN CUSTTABLE CT ON CT.ACCOUNTNUM = CIJ.ORDERACCOUNT
     AND CT.DATAAREAID = CIJ.DATAAREAID
     AND CT.PARTITION = CIJ.PARTITION
INNER JOIN DIRPARTYTABLE DTC ON DTC.RECID = CT.PARTY
     AND DTC.PARTITION = CT.PARTITION
WHERE DA.NAME = 'my specific dimension...check the table for a list of values'
GROUP BY DAVSI.DISPLAYVALUE, DT.NAME, CT.ACCOUNTNUM, DTC.NAME, CT.CREDITMAX
ORDER BY DAVSI.DISPLAYVALUE, CT.ACCOUNTNUM