Recently we were investigating why the Sage 300 ERP Financial Reporter dialogs would crash when launched from within Excel 2013. It turned out they were running afoul of Windows’ Data Execution Prevention (DEP). DEP is a security feature added to newer operating systems, basically to stop malware from finding a way to download code into a data area and then somehow causing it to execute, usually by overwriting the stack via a memory overrun bug.
OK, but Sage 300 ERP would certainly never try to do anything like that, so why would it crash with this sort of exception?
The Sage 300 ERP VB screens are built out of a number of ActiveX controls that provide data binding from Sage 300 Business Objects to the UI elements. As a result we don’t have to write any code for most data fields; we just need to wire them up in the screen editor.
When we created these controls for version 5.0A, there were a number of ways of doing this, and the one we chose was Microsoft’s Active Template Library (ATL), which lets you write the controls in C++ in an object oriented manner. And it turns out that ATL puts code into the data segment and then executes it.
So why does ATL do this? The basic problem for object oriented frameworks on Windows is that the core Windows API is not object oriented. Windows sends a notification for a window, where the window is specified by its Windows handle. So how do you know which window object in your framework should receive this notification message?

Microsoft’s MFC framework solved this problem by keeping a table mapping Windows handles to window objects; when each message comes in, it looks up which object it’s for and then calls that object. This gave MFC a reputation for being slow, since there are a lot of such messages and MFC spends a lot of time looking up objects. On the good side, this is a safe and reliable way of doing things and has never broken.

ATL decided to get tricky. For each window you can store a custom 32 bit value, so ATL made this a pointer to the object to call. Then when a message comes in, ATL builds the data for an assembler jump instruction incorporating this 32 bit address, and passes control to that jump instruction to reach the object. Notice that this is very fast, with no table lookup. But it does mean building a bit of code in data memory and then executing it. Generally this is referred to as “thunking”.
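To make the trade-off concrete, here is a small illustrative sketch of the two dispatch strategies. All the names are invented, and this is Python rather than real MFC or ATL C++; a closure stands in for ATL’s machine-code thunk, since both bind the object pointer directly into the dispatch path.

```python
# Illustrative sketch of the two window-message dispatch strategies.
# Invented names; not actual MFC or ATL code.

class Window:
    def __init__(self, name):
        self.name = name
        self.messages = []

    def on_message(self, msg):
        self.messages.append(msg)

# MFC-style: a global table maps window handles to objects, so every
# incoming message pays for a lookup.
handle_map = {}

def mfc_wndproc(hwnd, msg):
    window = handle_map[hwnd]   # table lookup on every message
    window.on_message(msg)

# ATL-style: a per-window "thunk" binds the object pointer directly,
# so dispatch reaches the object with no lookup.  In real ATL the thunk
# is a few machine instructions written into memory; a closure is the
# closest safe analogue in Python.
def make_thunk(window):
    def thunk(msg):
        window.on_message(msg)  # direct call, no table
    return thunk
```

The speed difference is simply the cost of the `handle_map` lookup on every one of the very many messages Windows delivers; ATL’s version pays that cost once, when the thunk is created.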
So basically ATL (and early versions of the .Net framework) use a technique that is also utilized by modern viruses. It is a very clever and fast way to do things, but unfortunately it is exactly the behavior DEP needs to block.
Newer versions of ATL (version 8 and above) now allocate a small block of memory from the operating system with the correct security attributes so that they can still do the same trick, but now the program has let Windows know that this is desired and correct behavior.
Current versions of Sage 300 ERP have their controls compiled using ATL 3.0, which came with the Visual C++ 98 compiler. The correct way to fix the problem is to compile with a later version of the compiler. We chose Visual Studio 2005, because most other things in our system are compiled with it, and it comes with ATL 8.1, which works fine with DEP.
Sounds simple. But there are twenty or so controls in the system, and newer versions of the C/C++ compiler and of ATL introduce quite a few differences. Generally moving to these newer versions is a good thing, but it introduced a few problems and we needed to ensure the system still worked correctly.
One good thing is that the newer C/C++ compiler has better warnings for detecting things like variables used before they are assigned, bad conversions and mismatched pointers. The compiler caught a few of these and they needed to be fixed, which makes the overall program more stable and reliable.
Another thing is that the newer ATL fixes a few bugs in the older ATL. For instance the older ATL didn’t set the background color of controls in all cases, so suddenly if a background color was set wrong it would show up, and a few UIs needed to be fixed to set background colors correctly. Generally these are good changes, but they take a bit of work. They also help with another project we have going to modernize the look of all our UIs.
Then we just have to make sure that our normally supported features like translation to double byte character languages, keyboard shortcuts, design time dialogs and such all still work as expected. This is a bit of a challenge with controls like the field edit control which have a lot of modes of operation.
There is always a lot of debate when we change the build to use a new version of the compiler. Will older programs still work? Will customers with older hardware still work? Is it worth the work and risk in changing things rather than sticking with the trusted and true?
I take the view that we have to allocate time in our releases to address technical debt. We need to upgrade various compilers, frameworks and bundled libraries. Otherwise we start having problems with newer versions of Windows, with newer hardware and generally with operating in modern environments. I think we need to take advantage of bug fixes, security fixes and performance fixes in the tools we are using.
Visual Studio 2012
Once we figured this out, we realized it explained why some ISVs were having trouble integrating with our system from Visual Studio 2012. DEP is now turned on by default for all new projects, which means you will GPF if you use any of our ActiveX visual controls. We then confirmed this was the problem. So when this fix is GA, it should also simplify integration work for our ISVs using modern tools. In the meantime you can set the /NXCOMPAT:NO linker option in your project to turn off DEP for your program. Obviously this isn’t ideal, but it is a workaround.
Usually Windows only turns DEP on for Windows system processes, but it can be configured to turn it on for all processes. However, individual programs can be built with DEP explicitly on or off, and how the program is built takes precedence over the Windows settings. This is why we ran into problems with Excel 2013: it is compiled with DEP turned on. Office 2013 is also a development platform, so turning on DEP for Office means anything integrated into Office has to be DEP compliant as well. This rules out anything built with older versions of ATL or the .Net framework.
When Will This Be Fixed?
We have fixed this for our upcoming Sage 300 ERP 2014 release (which will be released in 2013). We are currently testing as part of that project, but once we are confident we’ve fixed any minor glitches that are still present then we’ll bundle these updated controls together as a hotfix for Sage 300 ERP 2012.
Finding and solving the problem with our Financial Reporter and Excel 2013 was a bit of a relief, since it also explained a number of other problems that had been hanging around unsolved. It’s good to figure out why something has gone wrong and to fix it. It’s also good to know why some developers were having trouble integrating with Sage 300 ERP from VS2012.
Investigating some performance problems reported on systems running Sage 300 ERP led us down the road of investigating Windows Bit-Rot. Generally Bit-Rot refers to the gradual degradation of a system over time. Windows has a very bad reputation for Bit-Rot, but what is it? And what can we do about it? Some people go so far as to reformat their hard disk and re-install the operating system every year as a rather severe answer to Bit-Rot.
Windows Bit-Rot is the tendency for a Windows system to get slower and slower over time: slower to boot, slower to log in, and slower to start programs, along with other symptoms like excessive and continuous hard disk activity when nothing is running.
This blog posting is going to look at a few things that I’ve run into as well as some other background from around the web.
I needed to investigate why printing Crystal reports was quite slow on some systems. This involved software we have written as well as a lot of software from third parties. On my laptop Crystal would print quite slowly the first time and then quickly on subsequent runs. My computer is used for development and is full of development tools, so the things I found here might be more relevant to me than to real customers. So how to see what is going on? A really useful program for this is Process Monitor (procmon) from Microsoft (via their SysInternals acquisition). It shows you every access of the registry, the file system and the network, and you can filter the display; in particular you can filter to monitor a single program to see what it’s doing.
ProcMon yielded some very interesting results.
My first surprise was to see that every entry in HKEY_CLASSES_ROOT was read. On my computer, which has had many pieces of software installed, including several versions of Visual Studio, several versions of Crystal Reports and several versions of Sage 300 ERP, the number of classes registered there was huge. OK, but did it take much time? The first time something runs that does this, it seems to take several seconds; after that it’s fast, probably because the registry ends up cached in memory. Several .Net programs I tried do this. I’m not sure why; perhaps .Net just wants to know all the classes in the system.
But this does mean that as your system gets older and you install more and more programs (after all why bother un-installing when you have a multi-terabyte hard drive?), starting these programs will get slightly slower and slower. So to me this counts as Bit-Rot.
So what can we do about this? Un-installing unused programs should help, especially if they register a lot of COM classes; Visual Studio is the big one on my system, followed by Crystal and Sage 300. This helps a bit, but there are still a lot of classes there.
Generally I think uninstall programs leave a lot of bits and pieces in the registry. So what to do? Fortunately this is a good stomping ground for utility programs. Microsoft used to have RegClean.exe; Microsoft discontinued support for it, but you can still find it around the web. A newer and better utility is CCleaner from Piriform, and fortunately the free version includes a registry cleaner. I ran RegClean.exe first, which helped a bit, then ran CCleaner and it found quite a bit more to clean up.
Of course there is danger in cleaning your registry, so it’s a use at your own risk type thing (backing up the registry first is a good bet).
At the end of the day all this reduced the first-time startup of a number of programs by about 10 seconds.
My second surprise was the number of calls to check Windows Group Policy settings. Group Policy is a rather ad-hoc mechanism added to Windows to allow administrators to control networked computers on their domain. Each group policy is stored in a registry key, and when Windows goes to do an operation controlled by group policy, it reads that registry key to see what it should do. I was surprised at the amount of registry activity that goes on reading and checking group policy settings. Besides annoying users by restricting what they can do on their computer, it appears group policy causes a general high overhead of excessive registry reading in almost every aspect of Windows operation. There is nothing you can do about this, but it appears as Windows goes from version to version, that more and more gets added to this and the overhead gets higher and higher.
You may not think you install that many programs on your computer, so you shouldn’t have this sort of problem; but remember that many programs, including Windows/Microsoft Update, Adobe Updater and such, are regularly installing new programs on your computer. Chances are these are leaving behind unused bits of older versions that clutter up your file system and your registry.
Related to auto-updates, it seems that ever more programs run as icons in the task bar, install Windows services, or run automatically when you log in. All of these slow down the time it takes to boot Windows and to sign in. Further, many of these programs, like Dropbox, keep frequently polling their server for updates. Microsoft has a good tool, Autoruns for Windows, which shows you all the things that are automatically run and helps you remove them. Again this can be a bit dangerous, as some of them are necessary (perhaps like a trackpad utility).
Similarly it seems that everyone and their mother wants to install browser toolbars. Each one of these will slow down the startup of your browser and use up memory and possibly keep polling a server. Removing/disabling these isn’t hard, but it is a nuisance to have to keep doing this.
Hard Disk Fragmentation
Another common problem is hard drive fragmentation. As your system operates, the hard disk becomes more and more fragmented. Windows has a de-frag program, but it’s typically either scheduled to run at times when your computer is turned off, or you never bother to run it by hand. It is worth de-fragging your hard drive from time to time to speed up access. There are third party de-frag programs, but generally I just use the one built into Windows.
Related to the above problems, un-installation programs often leave odds and ends behind, and sometimes it’s worth going into Explorer (or a cmd prompt) and deleting the folders of un-installed programs. Generally this reduces clutter and speeds up operations like reading all the folders under Program Files.
Dying Hard Drives
Another common cause of slowness is that as hard drives age, rather than failing outright, they often start retrying sector reads more and more. Windows can mark sectors bad and move things around, and hard drives seem able to limp along this way for a while before completely failing. I tend to think that if you hear your hard drive resetting itself fairly often, you should replace it. Or, if you see the number of bad sectors growing when you defrag, replace it.
After going through this, I wonder if the people that just reformat their hard drive each year have the right idea? Does the time spent un-installing, registry cleaning, de-fragging just add up to too much? Are you better off just starting clean each year and not worrying about all these maintenance tasks? Especially now that it seems like we replace our computers far less frequently, is Bit-Rot becoming a much worse problem?
Sage 300 ERP has a fairly flexible mechanism for setting up your General Ledger Chart of Accounts. This is an important activity since it controls how you will be able to run financial reports and dice and slice your financial information. I’m not an accountant, so I might miss some of the finer points of accounting, and it’s always important to follow generally accepted accounting principles as much as possible. In some industries and in some countries your chart of accounts is specified for you; for instance China specifies the chart of accounts that companies must use.
So the actual chart of accounts can have some fairly hard constraints on how it’s set up. Fortunately there are some other mechanisms like account groups, account rollup, optional fields and transactional optional fields that can be used to enhance reporting capabilities.
I blogged on the general structure of G/L here. This blog posting is going to look a bit more in depth into the structure of G/L Accounts and how to generate some fairly flexible reports.
Each G/L Account can be up to 45 characters in length (formatted). It can consist of up to ten segments each of which can be up to 45 characters. One segment must be designated as the Account segment. Each combination of segments is called an Account Structure.
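The rules above can be sketched in a few lines of code. This is a hypothetical helper in Python, not a real Sage 300 API; the function name, the separator default, and the error handling are all invented, while the limits (up to ten segments, 45 formatted characters) come from the description above.

```python
# Hypothetical sketch of assembling a formatted G/L account from its
# segments, following the limits described above.  Not a Sage 300 API.

MAX_SEGMENTS = 10
MAX_FORMATTED_LEN = 45

def format_account(segments, separator="-"):
    """Join account segments with a separator into a formatted account."""
    if not 1 <= len(segments) <= MAX_SEGMENTS:
        raise ValueError("an account structure has 1 to 10 segments")
    formatted = separator.join(segments)
    if len(formatted) > MAX_FORMATTED_LEN:
        raise ValueError("formatted account exceeds 45 characters")
    return formatted
```

For example, an account structure of account/department/region segments `["4000", "100", "10"]` would format as `4000-100-10`.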
The account segments are defined on the segments tab of G/L Options and are stored in GLABK (GL0022). These are the building blocks for the G/L Accounts and must be defined first. In the Options UI you also specify the segment separator character and which segment is the Account Segment. Next you define the Account Structures in the G/L Account Structures setup UI, stored in GLABRX (GL0023); these specify the various combinations of account segments that you will be using, and you also specify which is the default structure code. The Account Segment isn’t validated and can be any value. The other segments can only take specific values, which you specify in the G/L Segment Codes setup UI, stored in GLASV (GL0021). Then you define the Account Groups you desire in the G/L Account Groups setup UI.
With these values setup, you can now enter your Chart of Accounts into the Accounts UI in the G/L Accounts folder. The Accounts are stored in GLAMF (GL0001).
If you are following generally accepted accounting principles you should have an idea of how you want your G/L Accounts structured and you should have an idea of how you want your Financial Reports to be structured which then dictates your Account Groups. The Account Groups specify the normal F/R reporting categories like “Cash and Cash Equivalents”, “Accumulated Depreciation” or “Provision for Income Taxes”.
Now that you have a structure, creating all these accounts one by one sounds rather tedious. Fortunately there is a G/L Create Accounts function that will create all your Accounts en masse. Our Chart of Accounts isn’t a sparse system, meaning you do need to create an Account before you use it, so this is a very useful tool.
There are two types of optional fields associated with G/L accounts. The first type is optional fields associated with the actual Account. These are typically used for reports where you are selecting various G/L Accounts to report on; you can then use these optional fields to control which Accounts are included. For instance the Chart of Accounts report or the Trial Balance report can print Accounts based on the values of these optional fields.
The other sort of optional fields associated with G/L Accounts are transaction detail optional fields. These carry information flowing in from the other ledgers, like A/R Invoices, and store the values for these fields with the transaction details. You can include these in reports like the Transaction Listing or Batch Listing reports.
Ultimately the final output of your ERP system is the Financial Reports. Generally CFOs want to look at their Financial Reports from all sorts of angles and all sorts of categorizations. Sage 300 ERP is quite efficient at Financial Reporting since it stores special Financial Set records when it posts batches, keeping a lot of data pre-calculated and easy to access. These are all stored in the GLAFS table (GL0003).
The main Financial Report UI lets you choose all sorts of range criteria and sort orders. These are very useful for getting out your financial statements, but within the Financial Statement specifications there are very powerful data inquiry functions and you can use the full power of Excel to manipulate the data, create charts, create pivot tables, show geographic distributions, etc.
The big drivers of Financial Reports are the Accounts, the Account Groups and the Account segments. However, all the functions that you place in your financial reports, like FRACCT, FRAMT and FRPOST, take filters that can include all sorts of criteria on the fields in the G/L Account record, along with optional fields specified like A.ACCTCLASS = "Sales" for Account optional fields and T.QUANTITY <= 0 for transactional optional fields. Keep in mind that account optional fields are used for filtering accounts, and transactional optional fields for filtering transactional amounts (like in FRPOST).
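The effect of a filter like A.ACCTCLASS = "Sales" can be illustrated with a toy sketch. The data, the field names, and the `fr_amt` function below are all invented for illustration; this is not the real Financial Reporter engine, just the filter-then-aggregate idea it embodies.

```python
# Toy illustration of an FR-style filter narrowing which accounts an
# aggregate function reports on.  Invented data and names; not the
# actual Financial Reporter.

accounts = [
    {"ACCT": "4000", "ACCTCLASS": "Sales",   "BALANCE": 500.0},
    {"ACCT": "4010", "ACCTCLASS": "Sales",   "BALANCE": 250.0},
    {"ACCT": "5000", "ACCTCLASS": "Expense", "BALANCE": 100.0},
]

def fr_amt(records, **criteria):
    """Sum balances over account records matching every field criterion."""
    return sum(r["BALANCE"] for r in records
               if all(r[field] == value for field, value in criteria.items()))
```

With this sketch, `fr_amt(accounts, ACCTCLASS="Sales")` totals only the two Sales accounts, while `fr_amt(accounts)` with no criteria totals everything.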
Usually you use the account groups to generate the main F/R statements by the usual accounting categories and then you restrict the account segments to get say departmental or geographic reports. Then you use optional fields to get more esoteric views of your financial data.
To some degree the power of the Financial Reporter depends on your having setup your Chart of Accounts properly in the first place.
But suppose you’ve been running for a while and realize you didn’t set things up ideally for your financial reporting needs? You’ve now got lots of transactions posted for G/L accounts which you feel are in the wrong structure. Now what do you do? The answer is the G/L Account Code Changer module that is included with G/L (you need to activate it separately from G/L). This module will do a search and replace on the database, changing all instances of a G/L account from one value to another. This way all your committed transactions and financial data are moved to the new account, and your financial reports can now be printed off the more ideal new structure. The Account Code Changer can also change the account segment separator and the segments.
Sage 300 ERP has very rich and flexible features allowing companies to create Charts of Accounts that lead to very powerful reporting capabilities. A good knowledge of the whole process is important to design that perfect Chart of Accounts, but if you make a mistake you can always use the Account Number Changer to fix it.
Modern ERP systems maintain a company’s full financial history for many years, and people want to be confident that all that data is correct and makes sense. So how can you be confident that your database has full referential integrity, especially after years and years of operation? The Sage 300 ERP Data Integrity function is a way to validate the integrity of a database. Modern computers are much more reliable than when our Data Integrity function was originally written, but it still serves a good purpose. In this article we will explore some of the mechanisms that protect data integrity in Sage 300, along with some of the possible causes of corruption.
The number one protection of data integrity in Sage 300 is database transactioning. Data is always written to the database inside a database transaction, and a transaction always takes the database from one state with full data integrity to the next. The database server guarantees that a transaction is either written to the physical database in its entirety or not at all; you will never see part of a transaction.
For instance, when we post a G/L batch, we post each entry as a database transaction; since each entry in a G/L batch must be balanced, database transactioning guarantees that the G/L is always in balance and hence data integrity is maintained.
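This all-or-nothing behavior is easy to demonstrate with a minimal sketch. The example below uses Python’s built-in SQLite rather than the actual Sage 300 posting code or SQL Server, and the table name and `post_entry` function are invented; but the transactional guarantee it shows is exactly the one described above.

```python
# Minimal sketch (SQLite, not actual Sage 300 posting code) of posting
# each G/L entry as one database transaction: either every detail line
# of the entry is written, or none are, so the ledger is never caught
# half-posted and out of balance.
import sqlite3

def post_entry(conn, details):
    """details is a list of (account, amount); amounts must net to zero."""
    try:
        with conn:  # opens a transaction; commits or rolls back atomically
            for account, amount in details:
                conn.execute(
                    "INSERT INTO gl_post (account, amount) VALUES (?, ?)",
                    (account, amount))
            if sum(amount for _, amount in details) != 0:
                raise ValueError("unbalanced entry")  # forces rollback
        return True
    except ValueError:
        return False
```

Note that the unbalanced entry’s rows are inserted and then rolled back as a unit; no reader of the database ever sees the half-posted entry.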
Where Do Integrity Errors Come From?
Database transactioning sounds great, and in fact with it we see very few database problems in Sage 300 ERP. But when we do get integrity problems, where do they come from?
Below is a list of some of the main causes of data integrity problems. I’m sure there are more. I’m not looking to blame anyone (including myself), just to point out the main causes I’ve seen:
- Bugs in the program. If Sage 300 asks SQL Server to store incorrect data, it will do so in a completely reliable transactional manner. Hopefully our QA processes catch most of these and this doesn’t happen often; but, Sage 300 is a large complicated program and mistakes happen.
- People editing database tables directly in SQL Server Enterprise Manager. For various reasons people might try to put something in the database that the program doesn’t allow, and often this leads to database corruption.
- Third party programs that write to the Sage 300 database directly. We do a lot of data validation checking in our business logic before allowing data to be written to the database, but if this is bypassed then corruption occurs. A common one in this case is not handling currency decimal places correctly.
- Data validation needs to be tightened. Now and again, someone has written data that we accepted as valid that wasn’t. Then we had to tighten our data validation routines. The good news here is that we’ve been doing this for a long time now.
- Bugs in the database server. We’ve seen database indexes get corrupted, which can lead to further problems even after the indexes are fixed (because of other data written while they were bad).
- Partial backups or restores. We’ve seen people back up the tables for each application independently and then restore them. Perhaps to try to put A/R back to yesterday. But this corrupts the database since there is often matching data that needs to be in sync in Bank, Taxes or perhaps Order Entry. Make sure you always backup and restore the database as a whole.
- Hardware glitches. Even with CRC checking and such, strange errors can start to appear from hard disk or memory hardware failures in computers.
The Data Integrity Checker
To find these sorts of problems, Sage 300 ERP has a data integrity checker in its Administrative Services.
You select the applications you want to check and whether you want to fix any minor errors. Since this can be a long process, for several applications you can also configure which parts within the application to check, by selecting the application and choosing the application options.
The end result is a report listing all the errors found.
What Does the Integrity Checker Look For?
So what does the Integrity Checker do? Below is a list of some of the checks that are typically made:
- Check the integrity of each file by reading each record and calling the View Verify API, which calls the business logic to validate the record. This includes things like checking that the decimals of a money amount are correct, that the data is correct for its data type, and that foreign keys are valid.
- For Header/Detail type relationships there are often total or summary fields in the header like the total amount of an order or the number of detail lines. The integrity checker will read through the details and add up any of these numbers to ensure they match the header.
- Check the database for any detail records that don’t have a matching header record (orphans).
- Each application then knows about all sorts of cross file relationships that must be maintained and the Integrity Checker for that application will validate all of these relationships.
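Two of the checks above, the header/detail total comparison and the orphan-detail scan, can be sketched together. The record layout and function below are invented for illustration and are much simpler than the real integrity checker, which works against the actual database tables.

```python
# Illustrative sketch (invented record layout, not the real integrity
# checker) of two checks described above: verifying that header totals
# match the sum of their detail lines, and finding orphan details that
# have no matching header record.

def check_integrity(headers, details):
    """headers: {order_id: total}; details: list of (order_id, amount)."""
    errors = []
    sums = {}
    for order_id, amount in details:
        if order_id not in headers:
            errors.append(("orphan detail", order_id))
            continue
        sums[order_id] = sums.get(order_id, 0) + amount
    for order_id, total in headers.items():
        if sums.get(order_id, 0) != total:
            errors.append(("header total mismatch", order_id))
    return errors
```

A real checker does this across every header/detail pair in every selected application, which is part of why a full run over a large database takes so long.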
What Does Fix Minor Errors Do?
There is a check box to fix minor errors, but what does it do? Mostly it fixes up header/detail relationships by correcting any total or summary fields in header records. It can also delete orphan detail records. But generally it doesn’t attempt much, because we don’t want to risk making things worse.
But it’s Slow
The big complaint about the Data Integrity checker is that it’s slow. This is because it goes through every record in the database as well as checking all the cross dependencies, and these days we see company databases that are hundreds of gigabytes in size. Generally the complaint is that you can’t just run it as routine maintenance overnight; you tend to have to configure what you want to check and run it selectively. It’s also best to run it when people aren’t in the system, since it does put a fair bit of load on the system.
Even with super reliable modern databases and hardware, data integrity errors can still creep in and need to be dealt with. Just being aware they exist is half the battle. Also remember that it is extremely important to have regular full backups of your data in case of a really catastrophic failure.
As we continue to move Sage 300 ERP to the Azure Cloud, one question that gets asked is whether someone running just G/L, A/P and A/R (the Glapar, which rhymes with clapper) is going to be negatively affected by the presence of, say, I/C, O/E and P/O. Fortunately, Sage 300 ERP activates each module independently, and unless an accounting module is activated in the database you don’t see it at all; it’s just as if you hadn’t installed it.
With per user pricing we’ve tended to bundle quite a large number of modules under our various pricing schemes. However if you get such a bundle and then activate everything you have, you can enable quite a few fields and icons that clutter things up, which is a nuisance if you never use them. Generally business flows better if you only see icons and fields that you actually use. Why keep seeing currency rate fields when you never select a different currency? Why see selections for things like lots and serial numbers when you don’t use these? Why see project and job costing icons when you don’t use this module?
Security settings and the built in form customization abilities can also be used to hide complexity. However if a feature is enabled, it usually implies that someone in your organization is going to have to deal with it. So consider what you activate in addition to setting up security and customizations for your users.
In this article, I’m going to go through the process of activating applications and provide some behind the scenes info on the various processes around these issues. A slightly related article is my posting on Sage 300’s multi-version support.
To use a module, it first has to be installed. Generally from the installation DVD image you can select (or de-select) most modules. There are some dependencies; for instance installing Purchase Orders implies a number of other accounting modules need to be installed as well. Each accounting module gets its own folder under the Sage 300 installation folders, formed from a two character prefix like GL or PO followed by a three character version like 61A (not the year based display version). Generally all accounting applications are created equal; the Sage 300 System Manager becomes aware of them by the presence of these folders and then gathers information on each application by looking for standard files stored in them (like the roto and group files).
When you create a new company in Sage 300, the only applications present by default are Administrative Services and Common Services; everything else is added through the Data Activation screen.
This program lets you choose which applications to activate into the database from the list of all installed accounting modules. When you select a given module, you may need to specify a few extra parameters in the next screen. The program will also tell you about any dependencies that are required and select these for you. Then when it goes to activate the programs, it calls the activation object in each selected application to let the application do whatever is required to add itself to the system. This usually involves creating all the database tables for the application and seeding any data that is set up automatically (like perhaps a default options record). If you are upgrading to a new version, it will do whatever is required to convert the data from the old version to the new.
You can run this program as many times as you like, so if you don’t activate something, you can always come back later and activate it then. Just keep in mind that after you activate something, you can’t de-activate it. We do put up a fairly strong message to ensure you back up your database before running data activation. Not all database conversions can be transactioned, so if data activation does fail, you may need to start over from a backup, though often you can fix the problem and run activation again to finish.
For our hosted versions, you don’t need to install anything and you don’t actually see the data activation screen, you select what you want from a web site and then the database is provisioned for you. For the on-premise version, installation and activation is usually performed by the business partner.
If you just activate General Ledger, then you will only see General Ledger on the desktop and won’t see icons from anything else that is installed.
Also notice that “Create Revaluation Batch” isn’t shown because I haven’t enabled multi-currency for this database.
Other Separate Features
Some modules like multi-currency, serialized inventory, lot tracking and optional fields aren’t installed via data activation. The database support for these modules is always present. To be able to use them you need to install the license for the module, and then you can enable the functionality within the other applications. For instance, to turn on multi-currency you enable it in the Company Profile screen in Common Services.
Until you do this, all the fields, functions and icons for these features are hidden and won’t clutter up your desktop or entry forms. So if you don’t really need them, don’t turn them on. Also keep in mind that once you enable these features, you can’t turn them off again; they are on permanently.
In one regard Sample Data is a bad example, since it has everything possible activated and enabled. Since it comes this way, applications will appear activated even if they aren’t installed. This sometimes causes odd problems, because functions that communicate between modules won’t work in that situation.
Sample data is a great way to show any feature in Sage 300 ERP, but in another regard it’s rather misleading. It tends to always be run as the ADMIN user and hence always shows all possible icons, fields, and functions. This makes the product look much more complicated than it really is in real-world usage. In the real world, you wouldn’t activate things you don’t need, and a user wouldn’t have security access to everything, so again many things would be hidden and their workspace simplified.
We don’t normally allow deactivating an application or turning off a feature like multi-currency. The reason is that data integrity problems could theoretically occur if you do. For instance, if you have processed payroll checks, then Bank needs Payroll present to reconcile those checks; if you deactivate Payroll while there are un-cleared checks in Bank, you will never be able to reconcile them. There are many cases like this, so as a general good-practice protection we prevent de-activation.
But the developers in the audience will know there is a back door. In the Sage 300 ERP SDK there is a “Deactivate” program that will deactivate an activated application. It does this by dropping all the tables for the application from the database and removing its entry from CSAPP. It does not do any cleanup of data that might be in any other accounting application’s tables. This is a great tool while developing a vertical accounting module for Sage 300, but if you use it on a production system, be really confident that the offending application hasn’t been used and that you aren’t going to leave corrupted data in all the other modules as a result of removing this one. Again, back up before proceeding. Similarly, turning off things like multi-currency by editing the CSCOM table in SQL Enterprise Manager carries the same caveats.
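In spirit, the Deactivate tool boils down to something like the following sketch; the application ID and table names here are invented, and the real tool reads the table list from the application’s data dictionary:

```python
# Hypothetical sketch of what a Deactivate-style tool effectively does:
# drop the application's tables, then delete its CSAPP row.
# Table names and the app ID below are illustrative only.
def deactivation_sql(app_id, tables):
    """Build the SQL statements to remove one application's data."""
    stmts = [f"DROP TABLE {t}" for t in tables]
    stmts.append(f"DELETE FROM CSAPP WHERE APPID = '{app_id}'")
    return stmts

for stmt in deactivation_sql("UP", ["UPEMPL", "UPCHKH"]):
    print(stmt)
```

Note that nothing here touches other modules’ tables, which is exactly why orphaned references can be left behind.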
Generally you want to keep your accounting system as simple as possible. Modular ERP systems like Sage 300 ERP have a great breadth of functionality, but most companies only need a subset of that, which is relevant for their industry. So be careful to only select what you need and keep your system a little simpler and easier to use.
There was a discussion on LinkedIn the other day, started because the latest version of Sage 100 ERP only allows one copy of itself to be installed on a given computer. Many programs operate this way, including most Microsoft products and other Sage products like Sage 300 ERP. The main reason for this is to avoid confusion for users when they are using integration technologies like COM or .Net, since then it’s easy to know what you are talking to when you integrate from another program. This is also how the Windows Installer works, so if you want to use that technology then this is what you get.
But the topic came up as to what to do to support multiple customers. The answer given was to use virtualization. We use this fairly extensively here at Sage for Development, QA and Support. This blog posting covers our uses of virtualization a bit more fully, along with some of the things we have discovered along the way.
The Sage 100 and Sage X3 groups use Oracle VirtualBox. This one is nice because it’s open source (Oracle acquired it as part of Sun). I’ve run VMs created with it, but I’ve never created one myself and don’t have much experience with it.
The Sage 300 team uses VMware. It used to be that you could use the VMware Player for free, but now it is only free for non-commercial use; at least it’s fairly cheap. Generally you only need the Player and not the Workstation version. One nice feature is Unity, which does an amazing job of integrating the virtual environment with your desktop environment, which is good for demo purposes.
For server based VMs we use VMware because our experience is that its memory usage is much better than the Microsoft Windows Server versions (though I haven’t played with Windows Server 2012 yet). The MS Server ones tend to lock a lot of memory, so you can’t run as many VMs. Our support department keeps a library of all supported operating systems times all supported product versions installed, so if a client problem comes up, say running XX version 3 on Windows XP 32-bit, we boot up the right VM and try to reproduce the customer’s problem.
Generally we find it useful to create a base operating system image like Windows 7 (64-bit) and keep a clean copy that we update every now and then with Windows updates. Then when we want a VM we just get a copy of the base operating system and install what we want on top of it. (We also keep some images of popular operating systems with Office and SQL Server as a better starting point.) Generally this gives a quick way to get running when a need arises.
We used to use MS VirtualPC a lot, but have moved away from it because MS doesn’t seem to be updating it anymore and it doesn’t support 64-bit client operating systems. This one is included with MSDN subscriptions, so if you have one of these, you probably have access to it.
It seems Microsoft is repurposing its VirtualPC software into its XP Mode feature, which allows you to run Windows XP-only software easily on Windows 7.
Client Operating System Licenses
Generally all the developers at Sage have an MSDN Universal subscription, so this gives us the licensing to do what we need with the client operating systems. For most development partners, there is a lot of benefit in having an MSDN subscription as well.
One disadvantage of virtual machines in the past has been how large they are (usually around 32Gig). This uses up disk space fast, but with cheap 3TB hard drives, this doesn’t seem to be much of a problem anymore.
I’ve found the main thing you need for good performance in virtual environments is lots of memory. If your computer has 8Gig RAM then you can allocate 4Gig to the VM and still have 4Gig for your base operating system. I do find that frequently switching back and forth between things in the VM and things in the base operating system can be slow, so I like to work for longer periods in one or the other.
Also, quite a few laptops have hardware virtualization support turned off by default; going into the BIOS setup and turning this on can speed up VMs quite a bit.
To me virtualization software is quite amazing. I’m astounded that I can just run Windows 8 or Linux easily on my Windows 7 laptop. I think virtualization software has come a long way and is still progressing quickly. If you haven’t tried it out recently and you need to keep things separated, then you really should try one of these out. It saves a lot of headaches not having to worry about the installation of one thing messing up something else you have installed.
With the upcoming Product Update 1 for Sage 300 ERP 2012 we will be releasing new versions of Canadian and US Payroll. These now have the new names of Sage HRMS Canadian Payroll 2012 and Sage HRMS US Payroll 2012, and they are separate installations that you need to download along with the main Product Update 1. The new name reflects that this is the common Sage Payroll used in several Sage products (including Sage HRMS), but since it is based on Sage 300 ERP Payroll, to Sage 300 ERP users it will just appear as a new version. The benefit of sharing the Sage 300 Payroll is to pool development resources and get further product enhancements incorporated into the core product.
There are quite a few new features across this offering, so I’ll quickly list them, noting whether they apply to both Payrolls, just Canadian, or just US.
Payroll Tax Number: A new Tax Number field has been added to the Company tab on the Payroll Setup > Options window. You can assign a tax number for each company or database.
Common Deduction Limits: You can use this screen to associate multiple deductions based on a single limit and ensure that employee deductions do not exceed defined annual maximums. Multiple deductions can now be combined to allow a single annual maximum to be applied, removing the need for manual calculation of group limits for multiple deductions such as union dues, garnishments, and retirement plans. And of course there is a matching report.
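A small worked example (with invented amounts) shows how a single shared limit differs from capping each deduction separately:

```python
# Sketch of applying one shared annual maximum across several deductions.
# All dollar amounts here are invented for illustration.
def apply_common_limit(deductions, taken_to_date, annual_max):
    """Cap this period's deductions so the running total never exceeds
    the shared annual maximum. List order determines priority."""
    room = max(annual_max - taken_to_date, 0)
    applied = []
    for amount in deductions:
        take = min(amount, room)
        applied.append(take)
        room -= take
    return applied

# Union dues 50 + garnishment 200 + retirement 300, but only 400 of a
# shared 5000 limit remains, so the last deduction is reduced.
print(apply_common_limit([50, 200, 300], 4600, 5000))  # → [50, 200, 150]
```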
General Ledger Accounts for Workers’ Compensation Groups: Employer expense and liability accounts from General Ledger can now be associated to a workers’ compensation group so that the expense and liability codes associated with that group post in the General Ledger batch file. With this enhancement, a complete picture is provided of the employer cost associated with each check run.
Workers’ Compensation Codes Modifications: Allow assignment of a G/L account to a Workers’ Compensation Group, and include:
Expense Account – The General Ledger expense account that posts the employer’s portion of workers’ compensation.
Liability Account – The General Ledger liability account that posts the employer’s portion of workers’ compensation.
Workers’ Compensation Transaction Reports: A new report format, G/L Transactions, lists the applicable assessment amounts of employee earnings and the amount of the General Ledger transaction, as well as the associated expense and liability accounts. This report also helps identify check details with missing G/L account associations. The Retrieve Assessments from Payroll Register option on the Workers’ Compensation report window allows processing of four new transaction reports. Existing reports have not been changed.
Transaction History: With the introduction of workers’ compensation calculations, you can now enter any adjustments for workers’ compensation using the Transaction History window. The Transaction History window now contains the additional fields needed for these adjustments: Base, Rate, and Assessment.
The Transaction History Report has been modified to include workers’ compensation information, applicable both as a total and at the employee level, detailing the employee’s earnings and the associated assessment amounts, bases, rates, and workers’ compensation groups/codes.
Cheque Printing Security Enhancements: Sage HRMS Payroll 2012 has been enhanced to improve security measures during the Cheque Printing and Posting process. With this update, users with security permissions that limit access to certain employees will no longer be able to print cheques for employees to which access is restricted.
Pre-Cheque Payroll Register and Payroll Registers: The Pre-Cheque Payroll Register and Payroll Register have both been modified to include workers’ compensation information when applicable, both as a total and on the employee level, detailing the employee’s earnings or hours and their associated assessment amounts, bases, rates, and group codes.
Employer Match Calculations for Timecard and Manual Check Entry: Two new fields are now available in Sage HRMS Payroll on the Timecards and Manual Checks windows. The new fields are available to allow you to temporarily change the employer contributions to the deductions during the Timecard or Manual Check entry process.
Expanded Cost Center Override: You can now define the cost center override for deductions with the Flat calculation method setting in the Earnings and Deductions setup window. Select the Cost Center Override Allocated Based on Calc Base check box on the Basic Info tab, and complete the Calc Base tab to apply the overrides. After an earning is associated with a flat deduction type, the deduction will then be allocated based on the associated earnings allocation.
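As a hypothetical illustration of the deduction following its associated earning, the sketch below splits a flat deduction in proportion to an earning’s cost-center allocation; the department codes and amounts are made up:

```python
# Hypothetical sketch: a flat deduction allocated in proportion to the
# associated earning's cost-center split. Codes and amounts are invented.
def allocate_flat_deduction(amount, earning_allocation):
    """Split a flat deduction across cost centers proportionally to the
    earning; the remainder goes to the last cost center so the pieces
    always sum exactly to the deduction amount."""
    total = sum(earning_allocation.values())
    centers = list(earning_allocation)
    pieces = {}
    allocated = 0.0
    for cc in centers[:-1]:
        share = round(amount * earning_allocation[cc] / total, 2)
        pieces[cc] = share
        allocated += share
    pieces[centers[-1]] = round(amount - allocated, 2)
    return pieces

# Earnings split 600/400 across two departments → a 100.00 deduction
# is allocated 60/40.
print(allocate_flat_deduction(100.0, {"DEPT-A": 600.0, "DEPT-B": 400.0}))
```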
International ACH Transfer (IAT): The new NACHA International ACH Transaction (IAT) rules require each transaction to be classified as foreign or domestic, allowing the depository financial institutions to identify whether the transactions meet their review requirements as outlined by the US Treasury Department. The IAT rules exist to identify the financial institution responsible for moving the funds rather than the originator or receiver of the funds. Any financial institution outside the territorial jurisdiction of the US that is involved in a payment transaction where it is moving the funds from the originator or receiver is required to transmit the transaction using the new IAT format.
Employee Work Location: A new tab, Work Location, was added to the Payroll Employees > Employees record window. On this tab, you can enter address information about an employee’s work location.
US Only Features
Sage Payroll Tax Forms and eFiling by Aatrix: This functionality provides updated reports and forms at no cost and enables you to fill out and file tax forms electronically for a small fee; including W-2, W-3, U.S. federal forms (941, 943, 944, and 945), U.S. state tax forms, and new hire reporting by state. The first time you sign in to Sage 300 ERP after installing Sage HRMS Payroll, Sage 300 ERP uses a secure connection to automatically register your company with Sage Payroll Tax Forms and eFiling by Aatrix.
Assign Tax to Employees: Instead of assigning taxes to one employee at a time, you can now assign a tax to multiple employees in a single operation. Selection criteria can be used to assign taxes to groups; for example, a state tax can be applied only to employees who live in a selected state.
The following new windows and reports have been added, moved, and consolidated to support Sage Payroll Tax Forms and eFiling by Aatrix:
- New Federal Tax Filing window: You can use this new window to fill out federal tax forms for the purpose of printing or e-filing them.
- New State Tax Filing window: You can use this new window to fill out state tax forms for the purpose of printing or e-filing them.
- Moved 941 window: The 941 window has been removed from Government Reports, and now appears in the list of forms on the Federal Tax Filing window.
- Consolidated W-2 Tax Filing window: The W-2s on Paper and W-2s on Disk(ette) windows have been replaced with a single W-2 Tax Filing window.
Child Support EFT: Sage HRMS Payroll has been enhanced to generate federally compliant Child Support EFT files for all states that require electronic submission to a State Disbursement Unit (SDU). A new tab, Garnishment, is available on the Payroll Employees > Employees record, which allows entry of data pertaining to a specific employee’s garnishment. The data entered on the Garnishment tab is used in the creation of EFT files using Generate EFT File while meeting Federal and State guidelines.
US Citizen Field: A new US Citizen field was added to the Class/Schd tab of the Payroll Employees > Employees record window. Selection of this option designates the employee as a US citizen.
Canadian Only Features
Support for Multiple QEIR and EIR Numbers per Employee: You can now enter both federal (EIR) and Quebec (QEIR) Employment Insurance rates for employees who are working between Quebec and other provinces. Sage HRMS Payroll calculates EIR if the Province of Employment field is set to provinces other than Quebec in the Employee record, and calculates QEIR if the Province of Employment field is set to Quebec. You will no longer have to add or remove EIR or QEIR depending on the employee’s work location.
Updated EFT Format For HSBC Bank: A new Bank format for HSBC was added in the Payroll Setup > EFT Options window.
People who are familiar with the internals of Sage 300 ERP may puzzle a bit about the versioning of this product. The official external version is 2012 like the rest of Sage 300 ERP 2012. However if you dig into the internal version numbers you will find that the main products are version 7.0A and then the Canadian tax tables are 7.0D and the US tax tables are 7.0G. So why is the version 7 when the internal version of everything else is 6.1A? The reason is that Sage HRMS Payroll is now the Payroll of several other Sage products and has its own versioning. Similarly why don’t the tax tables start at 7.0A, again the reason is that the earlier letters have already shipped with other products and that is where we are jumping in.
As you can see from all the features added and changes made, this is a fairly substantial update to the Payroll modules for Sage 300 ERP. Now that this is the standard Payroll module across several Sage products, we should see more good progress, certainly more than you would usually see associated with a Product Update.
We released Sage 300 ERP 2012 back in September 2012 and are now preparing to release our Product Update 1. PU1 will have a number of bug fixes along with a number of new features. In this blog article I’m going to explore one of the new features, which allows you to choose the color of the title bar and border on your Sage 300 windows by company.
Many of our users have data stored in multiple Sage 300 databases. Often they are comparing this data or copying it from one company to another, and they end up with several desktops open along with many screens. The company name is always part of the title bar, but this is quite subtle and it’s easy to do tasks in the wrong company as a result. This feature makes it more obvious which window belongs to which company with a stronger visual indicator, namely the color of the title bar and window border.
You configure the color of the company using Database Setup. The list of companies now includes a column for the company’s color. There is an option you can click to automatically assign colors to all your companies. Or you can clear all the colors and go back to no colors.
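One plausible way an “assign colors automatically” option could pick distinguishable colors is to spread hues evenly around the color wheel; this sketch is only a guess at the idea, not the actual Database Setup algorithm:

```python
import colorsys

# Sketch: assign each company a distinct color by spreading hues evenly
# around the HSV color wheel. Company IDs below are illustrative.
def auto_assign_colors(company_ids):
    """Return {company_id: '#RRGGBB'} with evenly spaced hues."""
    colors = {}
    n = max(len(company_ids), 1)
    for i, company in enumerate(company_ids):
        r, g, b = colorsys.hsv_to_rgb(i / n, 0.55, 0.90)
        colors[company] = "#%02X%02X%02X" % (round(r * 255), round(g * 255), round(b * 255))
    return colors

print(auto_assign_colors(["SAMLTD", "SAMINC", "TESTCO"]))
```

Moderate saturation and brightness keep the title text readable against any of the generated colors.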
If you want to select each company’s color yourself, you do this by editing the individual company.
Note that when a color is chosen, we take over drawing the title bar and window border to draw that color. If you want the natural Windows look, don’t select a color and we won’t touch the title bar or border, allowing Windows to do its default drawing.
Once configured you will get Desktops and screens looking like:
Notice that even the little preview windows when you hover over the taskbar icons show the colors.
Windows doesn’t provide any easy API for coloring the title bar or border of a window. This area is referred to as the non-client area, and the best Windows gives you is a hook to take over painting the entire area. This means you also have to paint the buttons for minimize, maximize and close, as well as the system menu, and draw the various states of these. Plus you have to draw the title text and decide whether to use white or black depending on the color selected.
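The white-or-black decision is typically made by computing the color’s perceived luminance; here is a minimal sketch of that calculation (the weights and the 128 threshold are a common convention, not necessarily what the product uses):

```python
# Sketch: pick white or black title text based on perceived luminance
# of the title bar color, using the common 0.299/0.587/0.114 weighting.
def title_text_color(r, g, b):
    """Return 'white' or 'black' for a title bar of color (r, g, b)."""
    luminance = 0.299 * r + 0.587 * g + 0.114 * b  # range 0..255
    return "black" if luminance > 128 else "white"

print(title_text_color(0, 0, 139))    # dark blue → white
print(title_text_color(255, 255, 0))  # yellow → black
```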
If you’ve ever run the Chrome or Firefox browsers, you may have noticed that they support themes which among other things take over the drawing of the non-client area. The neat thing then is that since these are open source projects, you can download their source code and see how they do it. The main thing for us was that we got confirmation that there wasn’t an easier way to do it and that what we were doing was a reasonable approach.
For this exercise we wanted to get the right balance of visual difference to not causing too jarring a change. We considered changing things like the form background, but in our opinion this was too drastic and didn’t work well with all color combinations. This approach also doesn’t interfere with the various Windows accessibility options so you can still use high contrast modes and such.
The way we implemented the change was to put the code into the regular desktop as well as into a4wcontainer, which acts as a COM host for our screen ActiveX controls. Each accounting screen is written in Visual Basic and compiled as an ActiveX control so that it can be embedded in other programs and customized via the API it exposes. When running normally, when the desktop is asked to run a VB form, it launches a4wcontainer, which then acts as the COM container hosting the ActiveX control. It is actually the COM container that is responsible for drawing the non-client area, so all we had to do was add the drawing code there to affect all VB screens.
At this point frequent screen customizers might object: unless you are an SDK application you won’t get this treatment. If you have embedded a screen in another programming system (say VB) and are distributing it as an EXE, then you are the COM container and won’t get the benefit of this treatment. Similarly, if you run from VBA, we have no control over drawing the VBA non-client area, so again you won’t get this treatment. But our feeling was that the current implementation should handle most of the need and a few exceptions were acceptable.
Our first Product Update for Sage 300 ERP 2012 should be out shortly and this article gives a quick overview of one of the usability features that has been added. Hopefully people find this useful and it will help them perform their work more accurately.
Sage 300 ERP has had a cloud version for over ten years now with Sage 300 Online. I blogged a bit about how this offering works here. Basically we offer our on-premise product in the cloud, relying on Citrix and Terminal Services technologies for hosting and access. You are basically running your on-premise Windows desktop ERP application in the cloud. The only thing required on the local computer to access the software is the Citrix client component, which is available for Windows, Mac, iPad, etc. We are currently hosting this in Sage’s Atlanta data center.
We are now looking to produce the next generation of this cloud offering, and we are looking to host it in Microsoft Azure rather than our own data center. There are quite a lot of reasons for this, like being able to deploy globally to various Azure datacenters or to take advantage of the Microsoft network, which provides quite low latency access to these data centers. But the Azure feature we are going to explore in this blog posting is what PaaS gives us over an IaaS offering. What are the differences and what are the considerations?
IaaS stands for Infrastructure as a Service. In this model the provider, typically someone like AWS or Rackspace, provides all the hardware, storage and networking equipment. Then they give you the ability to create virtual machines in this environment (often providing some starting templates of typical servers). It is then up to you to install everything you need to run on these virtual machines and to maintain them fully going forwards, ensuring all patches are installed, upgrading to new versions of the operating system, etc. If you need a database server then it’s up to you to buy and install that database server on one of your virtual machines and it is up to you to maintain this server, back it up, provide geographic redundancy and anything else that is required to keep the data safe.
PaaS stands for Platform as a Service. The goal of PaaS is to build on IaaS to have the vendor also take care of maintaining the operating system and the database server. So you don’t need to worry about patches, backing things up and various other chores. Microsoft Azure is an example of a PaaS offering (although they are getting into offering IaaS as well). Below is a diagram showing how the layers are often separated. This also gives the vendor the opportunity to provide more advanced monitoring and management tools for the platforms they are supporting.
Complexities of PaaS
That all sounds good, but how do you separate the maintenance of the operating system from the maintenance of the application? Generally the way this is done is that whenever you start a VM you start with a fresh copy of the operating system, which is then guaranteed to be at the most recent patch level. In the IaaS model you start with the operating system template once; then everything that happens to that image is stored to disk, and if the image is stopped and started you continue from where you left off. In the PaaS model, if your image is stopped and started, it is restarted with a fresh image of the operating system and you need to install all your software before it can start processing.
This is why most PaaS systems tend to be very specialized in the type of applications they can run. Azure is built around running ASP.Net applications. Microsoft provides support in Azure and .Net to easily deploy your ASP.Net application to a new operating system image as it starts up. Similarly Ruby on Rails PaaS vendors provide support for installing the Rails application on their new image as it starts.
But doesn’t that mean you have to transfer huge amounts of files to the environment every time a VM starts? Well actually no. Azure provides a number of non-VM storage mechanisms for storing files in the environment. So for instance you can use blob storage or one of the other storage mechanisms to hold all your files and data. Then the scripts that start the VM would just attach these to the VM when it starts, or your program knows how to use the Azure SDK to connect to these when it runs.
PaaS databases are a bit easier, since you create and access your SQL Server databases in Azure SQL just like you would in a regular SQL Server; you just need to know the right information for the connection string to the Azure SQL server. But there are still issues to be aware of. For instance, when you create your own set of VMs to make a production system, you can specify them in a VM group and ensure they all run in the same area of the Azure data center. This ensures any latency of communication between these VMs is very low. But when you switch to the PaaS database server, all the servers that comprise, say, Azure SQL run together in another area of the data center, and the latency from your server group to this server group is a bit higher than communication within your own group. Further, you are then sharing these servers with other unrelated applications, so you are relying on the vendor to provide sufficient processing power to keep everything performant. But the best thing about PaaS databases is that you don’t need to worry about tuning the server or backing up all the data.
Moving Sage 300 ERP to PaaS
But Sage 300 ERP is not an ASP.Net application. So how can we even run in this environment? Fortunately, Microsoft provides both IaaS and PaaS services in the Azure environment. This means we can move our current Sage 300 ERP cloud implementation from our Atlanta datacenter to Azure by using the Azure IaaS infrastructure. We can then start to take advantage of various PaaS services one by one.
The whole Azure infrastructure provides a very rich API which allows you to use PowerShell scripts to control what is going on. Hence the ASP.Net support is really a set of scripts developed to handle typical deployments, built into the Azure/Visual Studio SDK. For Sage 300 ERP we are developing similar scripts to create and control a Sage 300 ERP deployment. This way we can incorporate PaaS building blocks into our system. So we can have scripts to start and configure the whole system, scripts to add application servers and so forth. We can even integrate these into standard DevOps tools such as RightScale or Chef.
Leveraging Azure SQL is perhaps a bit easier since you can use the same access technology we currently use. We have to add a bit of code to our database driver to handle some cases that can happen in the Azure environment but generally using Azure SQL is very similar to using regular SQL.
The system that makes up Sage 300 ERP cloud consists of quite a few VMs all performing various tasks. Switching some of these to PaaS based servers is very straightforward, so we can do this right away. For our application servers that have full Sage 300 ERP installed, we will continue to maintain these as IaaS for the foreseeable future, since right now the overhead of installing full Sage 300 ERP is a bit high. But there are features coming in the Azure roadmap which will over time let us migrate these to PaaS as well. To some degree PaaS is still in its early stages; new services are regularly being rolled out and we can take advantage of them as they appear.
In our Atlanta data center we operate Sage 300 Cloud on a fixed set of servers that we purchased and as the usage grows we buy another server now and again. With a managed service like Azure we are paying for our infrastructure on a usage basis. However the system is elastic, meaning that we only need to run as many application servers as we need to handle the current load. So if usage is low during the weekend then we can shut down some application servers and save some money. Similarly if we get a big spike in usage we can easily add as many application servers as we need instantly to handle the load (perhaps at year end). Again basically we control all of this with a number of PowerShell Scripts.
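The elasticity decision itself can be as simple as the following toy sketch; the sessions-per-server capacity and the minimum pool size are invented numbers:

```python
import math

# Toy sketch of elastic scaling: decide how many application servers to
# run from the current session count. Capacity figures are illustrative.
def servers_needed(active_sessions, sessions_per_server=40, minimum=2):
    """Scale the application-server pool to the load, never dropping
    below a small always-on minimum."""
    return max(minimum, math.ceil(active_sessions / sessions_per_server))

print(servers_needed(10))   # quiet weekend → 2
print(servers_needed(500))  # year-end spike → 13
```

A scheduled script could run this calculation periodically and start or stop servers to match.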
Microsoft Azure is a very rich cloud environment. It provides us with a lot of benefits to help us beef up our Sage 300 ERP Cloud offering. But with that richness comes quite a bit of work: creating scripts to take advantage of it, tuning our application for this environment, and a learning curve to find the best ways to leverage all these new PaaS features.