When I was doing the training I was explaining what a Temporary Creditor (or Vendor for the American readers) was and why I hated them so much. As I was talking, I was showing that even though the temporary creditor itself had been deleted, the history for it remained.
It was when I tried to open the Payables Transactions navigation list that I got the below error:
Microsoft Dynamics GP
[Microsoft][SQL Server Native Client 11.0][SQL Server]Cannot insert the value NULL into column 'VENDNAME', table 'tempdb.dbo.##181163'; column does not allow nulls. INSERT fails.
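A quick way to confirm the cause is to look for history transactions whose creditor no longer exists in the Creditor Master (PM00200). The query below is a sketch rather than a definitive check; it assumes the relevant history is in the Payables Paid Transaction History (PM30200) table:

```sql
-- Creditor IDs with history in PM30200 but no record in the Creditor Master
SELECT DISTINCT PM30200.VENDORID
FROM PM30200
	LEFT JOIN PM00200
		ON PM00200.VENDORID = PM30200.VENDORID
WHERE PM00200.VENDORID IS NULL
```

Any IDs returned are deleted creditors whose history still feeds the navigation list.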
This particular client uses the eConnect incoming queue to integrate journals and payables invoices into Dynamics GP from a housing management system.
After installing eConnect and configuring the incoming queue I set about doing a test to ensure it was working.
Unfortunately, it didn’t.
A while ago I came across a bug in the Creditor (or Vendor for my American readers) Maintenance window where when a creditor is deleted the EFT information is not deleted; this has caused problems for a number of clients and I finally decided I needed to do something about it.
The result is a SQL trigger on the Creditor Master (PM00200) table; when a creditor is deleted the trigger runs and deletes all matching records in the Address Electronic Transfer Funds Master (SY06000) table:
CREATE TRIGGER dbo.utr_AZRCRV_DeleteSY06000 ON dbo.PM00200
AFTER DELETE
AS
/*
Created by Ian Grieve of azurecurve|Ramblings of a Dynamics GP Consultant (http://www.azurecurve.co.uk)
This code is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0 Int).
*/
DELETE ['Address Electronic Transfer Funds Master']
FROM SY06000 AS ['Address Electronic Transfer Funds Master']
	INNER JOIN deleted
		ON deleted.VENDORID = ['Address Electronic Transfer Funds Master'].VENDORID
GO
This avoids the possibility of a new creditor record being linked to a different creditor's bank details. As always with a script (especially one which deletes information), make sure it is tested and you're happy with how it works before releasing to live.
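Before deploying the trigger you can check whether any orphaned EFT records already exist; a sketch, assuming (as the trigger does) that VENDORID is the linking column between the two tables:

```sql
-- EFT records in SY06000 whose creditor no longer exists in PM00200
SELECT SY06000.*
FROM SY06000
	LEFT JOIN PM00200
		ON PM00200.VENDORID = SY06000.VENDORID
WHERE PM00200.VENDORID IS NULL
```

Review anything returned before deleting it, for the same reason you would test the trigger itself.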
I got a new laptop at work recently (i7, 16 GB RAM, 1080p) and have finally been able to pretty much transition away from VMware to Hyper-V (which I use at home for my test system); I only have a final couple of VMs to recreate or migrate. It has not all been smooth sailing, however, as I have found that if the laptop goes to sleep or is switched off, the VMs cannot be started again.
I think this is something to do with the AD group policy rather than an inherent problem, as my Surface Pro 3 works fine.
The solution is to restart the Hyper-V Virtual Machine Management service, but going into Services every time for this was soon going to get annoying, so instead I created a batch file I can run from a keyboard shortcut containing the following two lines:
net stop vmms
net start vmms
These commands stop and then start the service allowing me to use the VMs.
Instead, there is a command you can run at the command line to change the key:
cscript "C:\Program Files (x86)\Microsoft Office\Office15\OSPP.VBS" /inpkey:officekey
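The same script can also be used to confirm the installed key and licence status, and to trigger activation from the command line; both switches are part of OSPP.VBS:

```shell
# Display the current licence and partial product key
cscript "C:\Program Files (x86)\Microsoft Office\Office15\OSPP.VBS" /dstatus

# Attempt online activation after the key has been changed
cscript "C:\Program Files (x86)\Microsoft Office\Office15\OSPP.VBS" /act
```

Running /dstatus before and after the change is a quick way to confirm the new key has been accepted.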
Replace the highlighted section above with the new key you want to put in place; once done, you can then log into Office and activate the software online.
When I joined Perfect Image they had, unfortunately, standardised on VMware Workstation for virtualisation and I have been stuck using it ever since (we have recently been transitioning to Hyper-V which I intend to start using exclusively once I can get all the VMs either converted or recreated).
I was away for a demo the other week and the night before I tried to launch the demo VM to do some final preparation when I got the following error:
GP Demo - VMware Workstation
Not enough physical memory is available to power on this virtual machine with its configured settings.
This error message was a baffling one as I had run the VM a couple of days previously with no problems and had made no changes to the laptop in the intervening period.
The fix was a rather odd, yet simple one: launching VMware Workstation using Run as Administrator allowed the application to launch and the VM work without further problem.
Since doing this I have been able to launch the VM at any time under my usual security context.
I had an issue to deal with for a client recently where messages submitted to eConnect were no longer appearing in Microsoft Dynamics GP. I did the usual things of checking the Windows Event Log and ensuring that the two eConnect services were running.
One item I tried was to submit a test message to eConnect, which worked fine; so it worked for the perfectimage user I was logged in as, but for no-one else. I eventually found the answer in the Properties of the econnect_incoming queue:
The permissions for the perfectimage account (Full) were still configured correctly, but, somehow, the Send Message permission on Everyone had been removed. Once this setting had been added back, people were able to submit messages to the queue without further problem.
Part of the reason I wrote Microsoft Dynamics GP Workflow 2.0 was that I expected the module to be very popular with clients, and so it has proven. It seems a robust and easy to use solution which is a vast improvement on the old SharePoint based Workflow module.
Recently, after upgrading a client to Microsoft Dynamics GP 2015, I was doing a training session with people from their IT and finance departments, or at least I was trying to. However, when trying to select a manager for the Workflow Type we received the following error:
Microsoft Dynamics GP
[Microsoft][SQL Server Native Client 11.0][SQL Server]A .NET Framework error occurred during execution of user-defined routine or aggregate "GetAssignedUsers": System.IO.FileLoadException: Could not load file or assembly...
I switched to the demo system on my laptop to continue the training until we took a short break when I did some investigation. I couldn’t see anything apparent on the client’s system so I did a quick online search.
I quickly found this post where someone had the same error and Microsoftie Jonathan Fear recommended running the following SQL:
Once I had run this in SQL Server Management Studio, the error in Workflow Maintenance went away and we were able to complete the training on the client's system.
We’ve recently started work with a client to upgrade their Microsoft Dynamics GP 2010 R2 system to Microsoft Dynamics GP 2015. When running GP Utilities the following error, which caused the upgrade to fail, was produced:
The conversion process encountered an error and the temporary table did not get removed.
As a first step, we restored the system database and restarted GP Utilities and got the same error again. I did a little exploring of the Company Master table and found that there were entries for companies which did not exist.
After restoring the DYNAMICS database again and running the Clear Companies.sql script from Microsoft, we were able to run GP Utilities without further errors and complete the upgrade.
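Orphaned company records can be spotted before running GP Utilities; a sketch, assuming the Company Master is DYNAMICS..SY01500 and that each valid company has a database whose name matches its INTERID:

```sql
-- Company Master entries with no matching company database on the server
SELECT CMPANYID, CMPNYNAM, INTERID
FROM DYNAMICS..SY01500
WHERE INTERID NOT IN (SELECT name FROM sys.databases)
```

Anything returned here is a candidate for the clean-up performed by the Clear Companies.sql script.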
While implementing Microsoft Dynamics GP for a client earlier this year, we used macros to load Assembly Transactions into Dynamics GP as there is no integration available in Integration Manager. The macros were created on the client's development server, tested, and then migrated to the live server.
However, despite the development and live servers having exactly the same Windows and Dynamics GP configurations, the following error was generated on the live server when trying to run the macro:
Microsoft Dynamics GP
Keyword or punctuation expected, but not found. (Line #2)
# DEXVERSION=11.00.0359.000 2 2
CheckActiveWin dictionary 'default'  form bmTrxEntry window bmTrxEntry
  TypeTo field 'TRX ID' , '262307A'
  MoveTo field 'TRX Date'
The code displayed above is from the macro file; the problem was that the keywords (such as dictionary) were not capitalised when the macro was recorded and tested on the development server, but needed to be capitalised for the live server. To get the capitalisation correct, I recorded a macro on live and then used the source CSV file to create a new macro, using the newly recorded one as a template for the mail merge.
It did mean that this was not tested the way the previous one was, but we’d proved the method on test and this was deemed acceptable.
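The mail-merge step can also be scripted; below is a minimal sketch in Python of merging CSV rows into a recorded-macro template. The column names, template text, and field values are hypothetical stand-ins, not taken from the client's actual files:

```python
import csv
import io

# Header line taken from a freshly recorded macro (version number illustrative).
MACRO_HEADER = "# DEXVERSION=11.00.0359.000 2 2\n"

# Per-row template based on a recorded macro; {trx_id} and {trx_date}
# are substituted from each CSV row.
MACRO_TEMPLATE = (
    "CheckActiveWin dictionary 'default'  form bmTrxEntry window bmTrxEntry\n"
    "  TypeTo field 'TRX ID' , '{trx_id}'\n"
    "  MoveTo field 'TRX Date'\n"
    "  TypeTo field 'TRX Date' , '{trx_date}'\n"
)


def build_macro(csv_text: str) -> str:
    """Merge each CSV row into the macro template and return the macro file text."""
    rows = csv.DictReader(io.StringIO(csv_text))
    body = "".join(
        MACRO_TEMPLATE.format(trx_id=row["TRX ID"], trx_date=row["TRX Date"])
        for row in rows
    )
    return MACRO_HEADER + body


if __name__ == "__main__":
    sample = "TRX ID,TRX Date\n262307A,28/04/2015\n"
    print(build_macro(sample))
```

Because the per-row text comes verbatim from a macro recorded on the target server, the capitalisation of the keywords is guaranteed to match what that server expects.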