Recently we were investigating why the Sage 300 ERP Financial Reporter dialogs would crash when launched from within Excel 2013. It turned out that they were running afoul of Windows' Data Execution Prevention (DEP). DEP is a security feature added to newer operating systems, basically to stop malware from finding a way to download code into a data area and then somehow causing it to execute, usually by taking advantage of a memory overrun bug to overwrite the stack.
OK but Sage 300 ERP would certainly never try to do anything like that, so why would it crash with this sort of exception?
The Sage 300 ERP VB screens are built out of a number of ActiveX controls that provide data binding from Sage 300 Business Objects to the UI elements. As a result we don't have to write any code for most data fields; we just need to wire them up in the screen editor.
When we created these controls as part of version 5.0A, there were a number of ways of doing this, and the one we chose was Microsoft's Active Template Library (ATL), where you write the controls in C++ in an object-oriented manner. And it turns out that ATL puts code into the data segment and then executes it.
So why does ATL do this? The basic problem with object-oriented frameworks on Windows is that the core Windows kernel is not object oriented. Windows sends a notification for a window, where the window is specified by its Windows handle. So how do you know which window object in your framework should get this notification message?

Microsoft's MFC framework solved this problem by keeping a table mapping Windows handles to window objects; when each message comes in, it looks up which object it's for and then calls that object. This gave MFC a reputation for being slow, since there are a lot of such messages and MFC spends all its time looking up objects. But on the good side this is quite a safe and sure method of doing things and has never broken.

ATL decided to get tricky. For each window you can add a custom 32-bit value, so ATL made this a memory pointer to the object code to call. Then when a message comes in, ATL builds the data for an assembler jump instruction, appends this 32-bit address, and passes control to the jump instruction to call the object. Notice that this is done very quickly, with no table lookup. But it does mean building a bit of code in data memory and then executing it. Generally this is referred to as "thunking".
So basically ATL (and early versions of the .Net framework) execute the same design pattern utilized by modern viruses. This is a very clever and fast way to do things, but unfortunately it needed to be blocked.
Newer versions of ATL (version 8 and above) now allocate a small block of memory from the operating system with the correct security attributes so that they can still do the same trick, but now the program has let Windows know that this is desired and correct behavior.
Current versions of Sage 300 ERP have their controls compiled using ATL 3.0, which came with the Visual C++ 98 compiler. The correct way to fix the problem is to compile with a later version of the compiler; we chose Visual Studio 2005 because most other things in our system are compiled with it, and it uses ATL 8.1, which works fine with DEP.
Sounds simple. But there are twenty or so controls in the system, and there are quite a few differences introduced with newer versions of the C/C++ compiler and with ATL. Generally moving to these newer versions is a good thing, but it introduced a few problems and we needed to ensure the system still worked correctly.
One good thing is that the newer C/C++ compiler has better warnings for detecting things like variables used before they are assigned, bad conversions and mismatched pointers. The compiler detected a few of these and they needed to be fixed. Generally this is a good thing since it makes the overall program more stable and reliable.
Another thing about the newer ATL is that it fixes a few bugs in the older ATL. For instance, the older ATL didn't set the background color of controls in all cases, so suddenly if a background color was set incorrectly it would show up, and a few UIs needed to be fixed to set background colors correctly. Generally these are good things, but they take a bit of work to correct. They also help with another project we have going to modernize the look of all our UIs.
Then we just have to make sure that our normally supported features, like translation to double-byte character languages, keyboard shortcuts and design-time dialogs, all still work as expected. This is a bit of a challenge with controls like the field edit control, which has a lot of modes of operation.
There is always a lot of debate when we change the build to use a new version of the compiler. Will older programs still work? Will the system still work for customers with older hardware? Is it worth the work and risk of changing things rather than sticking with the tried and true?
I take the view that we have to allocate time in our releases to address technical debt. We need to upgrade various compilers, frameworks and bundled libraries; otherwise we start having problems with newer versions of Windows, with newer hardware, and generally with operating in modern environments. I think we need to take advantage of the bug fixes, security fixes and performance fixes in the tools we are using.
Visual Studio 2012
Once we figured this out, we realized this explained why some ISVs were having trouble integrating with our system from Visual Studio 2012. DEP is now turned on by default for all new projects, which means you will GPF if you use any of our ActiveX visual controls. We then confirmed this was the problem. So when this fix is GA, it should also simplify integration work for our ISVs using modern tools. In the meantime you can set /NXCOMPAT:NO in your project to turn off DEP for your program. Obviously this isn't ideal, but it is a workaround.
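To my knowledge, the equivalent setting in a Visual C++ project file looks like the fragment below (this is the MSBuild property that maps to the /NXCOMPAT linker flag; you can also change it interactively under Linker > Advanced > Data Execution Prevention (DEP) in the project properties):

```xml
<!-- In the .vcxproj: opt the EXE out of DEP (equivalent to /NXCOMPAT:NO) -->
<ItemDefinitionGroup>
  <Link>
    <DataExecutionPrevention>false</DataExecutionPrevention>
  </Link>
</ItemDefinitionGroup>
```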
Usually in Windows, DEP is only turned on for Windows system processes, but Windows can be configured to turn it on for all processes. However, individual programs can be configured to have DEP on or off when they are built, and how the program is built takes precedence over the Windows settings. This is why we ran into problems with Excel 2013, since it is compiled with DEP turned on. However, Office 2013 is also a development platform, so turning on DEP for Office means anything integrated into Office has to be DEP compliant as well. This eliminates using anything built with older versions of ATL and the .Net framework.
When Will This Be Fixed?
We have fixed this for our upcoming Sage 300 ERP 2014 release (which will be released in 2013). We are currently testing as part of that project, but once we are confident we’ve fixed any minor glitches that are still present then we’ll bundle these updated controls together as a hotfix for Sage 300 ERP 2012.
Finding and solving the problem with our Financial Reporter and Excel 2013 was a bit of a relief, since it also explained a number of other problems that had been hanging around unsolved. It's good to figure out when something has gone wrong and to fix it. It's also good to know why some developers were having trouble integrating with Sage 300 ERP from VS2012.
Today, I'm not going to talk about technology or ERP. Instead I'm going to talk about one of my other loves, namely running. I started running back when I was 35 years old. For most of my life before that I couldn't put on weight no matter how hard I tried; I was perpetually extremely skinny. Then I turned 35 and things changed. I started to put on weight quickly, and my only exercise was various weekend-warrior-type activities. As a result I ended up hurting my back while ice skating and was immovable for several days. As part of my recovery I went to physical therapy, where they basically blamed everything on my lack of physical fitness.
Once I recovered from my back injury, I pledged to start exercising regularly. At that time I lived in Tsawwassen, and several of my neighbors were avid runners. So I started getting up early and going for a run every second morning along the dyke. Once I got going, I joined the Tsawwassen running club, the Bayside Striders. After that I worked my way up from 5K runs to 10K runs to half marathons and eventually full marathons.
I found that running alone, with the long distances required for marathon training, was too hard on my body, and I would frequently get injured. So after running four marathons I took up triathlon, which mixes up the training between swimming, running and cycling. Since doing that I haven't had a sports-related injury (knock wood). I worked my way up to participating in the Victoria Half-Ironman race. Since then I've continued training, but only participated in a few running races, including the Sun Run and the Terry Fox Run.
I’ve found running and triathlon are great ways to stay in shape. But for the last year or so I’ve been letting other priorities get in the way of my regular training. As a result I’ve gained back some weight and ran my slowest Sun Run since my first year participating. I think part of that is that I haven’t been entering races. I think unless you have a goal it’s just too easy to let other things take precedence.
My wife, Cathalynn, has been afflicted with quite bad psoriatic arthritis for the past few years and volunteers for the Mary Pack Arthritis Program. She was attending a party thrown for volunteers and found out that the Arthritis Society was participating in the Scotiabank Vancouver Half Marathon Charity Challenge. I found out about this just as I finished the Sun Run and realized that I would have two months to train and bring my running distance up from 10 km to the 21 km half-marathon distance. This seemed like an ideal goal: improve my fitness by increasing my running distance while raising money for a charity I really believe in. So I set this as my goal, started fundraising, and started increasing my long runs by 2 km each week.
Cathalynn has written a guest piece for this blog on her Arthritis experience here.
Setting a goal has definitely made a difference. Rather than just skipping a midweek run just because I have a busy day or too many meetings, now I find a way to get that run in. Similarly now my long run becomes much more important on the weekend.
Running for Charity
Running for charity is a great way to combine a running goal with giving back to the community. Fundraising is easy now with the Web, whenever you sign up for one of these, a web site gets created that people can contribute to.
So if you want to check out my site, it's at:
and please sponsor me for this race. It’s a great cause.
Garmin Smart Watch
Well, I guess I can't really blog entirely without mentioning technology. When I run I use my Garmin GPS watch, which records my route and all sorts of running data along with it. So for instance you can see the data for my 14 km long run last weekend here. It also has a heart rate monitor and will record your heart rate data as well (though I didn't wear it on the run in the link). There is a lot of talk about Apple coming out with a smart watch, but I tend to think that Garmin has been doing this successfully for quite some time now. Incidentally, Cathalynn gave me this watch one Christmas, so thanks for the great gift.
Running is a great way to stay healthy, to reduce weight and to increase cardiovascular health. The main obstacle is usually fitting it into our busy schedules. Setting goals like big races is a great way to provide motivation and to give our running priority. Running for a charity makes it personal and provides a great way to give back to the community.
This is a guest blog posting by my wife Cathalynn Labonté-Smith on her experience with Arthritis. You can sponsor me in my run here.
I sat on a denim loveseat drinking blueberry punch while flat bluebirds and delicate azure flowers peered down at me from the ceiling. No, it wasn’t an Alice in Wonderland-type daydream; I was at the Mary Pack Arthritis Centre Volunteer Tea in celebration of their 75th Anniversary. As a patient, sporadic contributing writer and volunteer, my interest perked up when I heard there were 10 free entries to the ScotiaBank Half-Marathon and 5K Run for anyone willing to raise funds for the Arthritis Society.
Steve was looking for a race goal he could manage around all those business trips. He loves meeting all of you readers and has a severe case of travelophilia. However, it does make it hard for him to fit triathlon training into his busy schedule, but a half marathon just as he was coming off the annual Vancouver Sun 10K Run sounded perfecto.
I did that race in my pre-arthritis days. I envisioned a brisk downhill pace from the top of the University of British Columbia campus to the Lumberman’s Arch in Stanley Park. I didn’t expect a heat wave so I left behind my hat as did many other runners who succumbed to heat stroke that day. Being fair-haired and freckle-faced I don’t take to heat well and knew I was far off even my usual snail’s pace when the walkers started passing me. Finally, the ambulance crew on bikes pulled up to me and asked if they could have their picture taken with me at the finish.
“I know what this means,” I said between laboured breaths, “I’m dead last, aren’t I.” They didn’t get a chance to answer because a petite runner ahead of me passed out. I felt terrible for her but was glad that the cheerful paramedic pair cycled off to someone in much more need of their services–I was still upright, after all.
I was about a mile from the finish when Steve came into focus. "Where have you been?" he asked. I can't remember my answer; it was probably a half-sobbed, "Here." He gave me a pep talk and got me to keep up with him that last painful stretch. In the end, I got my medal and fell apart like a two-year-old when I found out there were no orange slices left because all the food had gone to the finishers before me.
I thought life was tough that day because I was poorly prepared for a long hot run and they were out of oranges—until the Arthritis Fairy in her blue tulle dress and cruel twisted wand came to town. About six years ago I went to sleep a busy teacher and recreational athlete to wake up an arthritis sufferer.
Since then it’s been years of constant pain, crutches, physical therapy, occupational therapy, medical leave, water walking, splints, braces, four trials of biologic drugs, off-label chemotherapy, side effects, countless tests, specialists, total life makeover, adjustments, modifications, plethora of changes little and great, losses and gains, like two new walking partners (Chihuahuas Patches and Vicky) and the people who were the keepers, like Steve.
I won’t be running any foot races with inflammatory arthritis affecting over 20 joints mostly in my feet and hands on any given day, but I will be participating in a 1K Walk to fight Arthritis next month with my casual pokey posse—Steve, our kids with paws, niece, Katrina and her entourage, neighbours, friends, their kids and/or their kids with paws. To me this is as big a milestone as any previous athletic accomplishment pre-arthritis.
If you have arthritis or know anyone with arthritis, which strikes people between the ages of two (when sufferers are too young to even say the word) and 102, please consider sponsoring Steve in his half-marathon. There is still much research to be done to make the quality of life better for those of us who suffer from one or more of the 100 different types of arthritis. I'll drop him at the start of the race, making sure he has a hat, and be there at the end with a bag of orange slices, waiting with a big smile on my freckled face. Go Steve Go!
NOTE: For those of you who will ask or just silently wonder about my last name. I didn’t tack on Stephen’s last name to Labonté. Indeed, I kept my maiden name—it just happened to be hyphenated and the second name happened to be Smith. I wasn’t specifically looking for a mate with a last name that was either Labonté or Smith to make things easier in the not needing to change one’s driver’s license department but it was a happy coincidence.
Investigating some performance problems reported on systems running Sage 300 ERP led me down the road of investigating Windows Bit-Rot. Generally, Bit-Rot refers to the gradual degradation of a system over time. Windows has a very bad reputation for Bit-Rot, but what is it? And what can we do about it? Some people go so far as to reformat their hard disk and re-install the operating system every year as a rather severe answer to Bit-Rot.
Windows Bit-Rot is the tendency for a Windows system to get slower and slower over time: slower to boot, longer to log in, and longer to start programs, along with other symptoms like excessive and continuous hard disk activity when nothing is running.
This blog posting is going to look at a few things that I’ve run into as well as some other background from around the web.
I needed to investigate why printing Crystal reports was quite slow on some systems. This involved software we have written as well as a lot of software from third parties. On my laptop, Crystal would print quite slowly the first time and then quickly on subsequent runs. My computer is used for development and is full of development tools, so the things I found here might be more relevant to me than to real customers. So how to see what is going on? A really useful program for this is Process Monitor (procmon) from Microsoft (from their SysInternals acquisition). This program will show you every access of the registry, the file system and the network. You can filter the display; in particular, you can filter to monitor only a single program to see what it's doing.
ProcMon yielded some very interesting results.
My first surprise was to see that every entry in HKEY_CLASSES_ROOT was read. On my computer, which has had many pieces of software installed, including several versions of Visual Studio, several versions of Crystal Reports and several versions of Sage 300 ERP, the number of classes registered here was huge. OK, but did it take much time? Well, the first time something runs that does this, it seems to take several seconds; after that it's fast, probably because the registry ends up cached in memory. It appears that several .Net programs I tried do this. I'm not sure why; perhaps .Net just wants to know all the classes in the system.
But this does mean that as your system gets older and you install more and more programs (after all why bother un-installing when you have a multi-terabyte hard drive?), starting these programs will get slightly slower and slower. So to me this counts as Bit-Rot.
So what can we do about this? Un-installing unused programs should help, especially if they use a lot of COM classes. Visual Studio being the big one on my system, followed by Crystal and Sage 300. This helps a bit. But there are still a lot of classes there.
Generally I think uninstall programs leave lots of bits and pieces in the registry. So what to do? Fortunately this is a good stomping ground for utility programs. Microsoft used to have RegClean.exe; Microsoft discontinued support for this program, but you can still find it around the web. A newer and better utility is CCleaner from Piriform, and fortunately the free version includes a registry cleaner. I ran RegClean.exe first, which helped a bit, but then ran CCleaner and it found quite a bit more to clean up.
Of course there is danger in cleaning your registry, so it’s a use at your own risk type thing (backing up the registry first is a good bet).
At the end of the day, all this reduced the first-time startup of a number of programs by about 10 seconds.
My second surprise was the number of calls to check Windows Group Policy settings. Group Policy is a rather ad-hoc mechanism added to Windows to allow administrators to control networked computers on their domain. Each group policy is stored in a registry key, and when Windows goes to do an operation controlled by group policy, it reads that registry key to see what it should do. I was surprised at the amount of registry activity that goes on reading and checking group policy settings. Besides annoying users by restricting what they can do on their computer, it appears group policy causes a general high overhead of excessive registry reading in almost every aspect of Windows operation. There is nothing you can do about this, but it appears as Windows goes from version to version, that more and more gets added to this and the overhead gets higher and higher.
You may not think that you install that many programs on your computer, so you shouldn’t have these sort of problems but remember many programs including Windows/Microsoft Update, Adobe Updater and such are regularly installing new programs on your computer. Chances are these programs are leaving behind unused bits of older versions that are cluttering up your file system and your registry.
Related to auto-updates, it seems that many programs now run as icons in the task bar, install Windows services, or install programs to run when you log in. All of these slow down the time it takes to boot Windows and to sign in. Further, many of these programs, like Dropbox, will keep frequently polling their server to see if there are any updates. Microsoft has a good tool, Autoruns for Windows, which helps you see all the things that are automatically run and helps you remove them. Again, this can be a bit dangerous, as some of them are necessary (perhaps like a trackpad utility).
Similarly it seems that everyone and their mother wants to install browser toolbars. Each one of these will slow down the startup of your browser and use up memory and possibly keep polling a server. Removing/disabling these isn’t hard, but it is a nuisance to have to keep doing this.
Hard Disk Fragmentation
Another common problem is hard drive fragmentation. As your system operates, the hard disk becomes more and more fragmented. Windows has a built-in defrag program, but it may be scheduled to run at times when your computer is turned off, or you may never bother to run it by hand. It is worth defragging your hard drive from time to time to speed up access. There are third-party defrag programs, but generally I just use the one built into Windows.
Related to the above problems, often un-installation programs leave odds and ends files around and sometimes it’s worth going into explorer (or a cmd prompt) and deleting folders for un-installed programs. Generally it reduces clutter and speeds up operations like reading all the folders under program files.
Dying Hard Drives
Another common cause of slowness is that as hard drives age, rather than just outright failing, they often start having to retry reading sectors more. Windows can mark sectors bad and move things around. Hard drives seem to be able to limp along this way for a while before completely failing. I tend to think that if you hear your hard drive resetting itself fairly often, you should replace it. Or if you see the number of bad sectors growing when you defrag, replace it.
After going through this, I wonder if the people that just reformat their hard drive each year have the right idea? Does the time spent un-installing, registry cleaning, de-fragging just add up to too much? Are you better off just starting clean each year and not worrying about all these maintenance tasks? Especially now that it seems like we replace our computers far less frequently, is Bit-Rot becoming a much worse problem?
Sage 300 ERP has a fairly flexible mechanism for setting up your General Ledger Chart of Accounts. This is a fairly important activity, since it controls how you will be able to run financial reports and dice and slice your financial information. I'm not an accountant, so I might miss some of the finer points of accounting, and it's always important to follow generally accepted accounting principles as much as possible. In some industries and in some countries your chart of accounts is specified for you; for instance, China specifies the chart of accounts that companies must use.
So the actual chart of accounts can have some fairly hard constraints on how it’s set up. Fortunately there are some other mechanisms like account groups, account rollup, optional fields and transactional optional fields that can be used to enhance reporting capabilities.
I blogged on the general structure of G/L here. This blog posting is going to look a bit more in depth into the structure of G/L Accounts and how to generate some fairly flexible reports.
Each G/L Account can be up to 45 characters in length (formatted). It can consist of up to ten segments, each of which can be up to 45 characters. One segment must be designated as the Account segment. Each combination of segments is called an Account Structure.
The account segments are defined in G/L Options on the Segments tab and are stored in GLABK (GL0022). These are the building blocks for the G/L Accounts and must be defined first. In the Options UI you also specify the segment separator character and which segment is the Account segment. Next you define the Account Structures in the G/L Account Structures setup UI; these are stored in GLABRX (GL0023) and specify the various combinations of account segments that you will be using. You also specify which is the default structure code. The Account segment isn't validated and can be any value. The other segments can only have specific values, which you specify in the G/L Segment Codes setup UI; these are stored in GLASV (GL0021). Then you define the Account Groups you desire in the G/L Account Groups setup UI.
With these values setup, you can now enter your Chart of Accounts into the Accounts UI in the G/L Accounts folder. The Accounts are stored in GLAMF (GL0001).
If you are following generally accepted accounting principles you should have an idea of how you want your G/L Accounts structured and you should have an idea of how you want your Financial Reports to be structured which then dictates your Account Groups. The Account Groups specify the normal F/R reporting categories like “Cash and Cash Equivalents”, “Accumulated Depreciation” or “Provision for Income Taxes”.
Now that you have a structure, creating all these accounts one by one sounds rather tedious. Fortunately there is a G/L Create Accounts function that will create all your Accounts en masse. Our Chart of Accounts isn't a sparse system, meaning that you do need to create an Account before you use it, so this is a very useful tool.
There are two types of optional fields associated with G/L accounts. The first type is optional fields associated with the actual Account. These are typically used for reports where you are selecting various G/L Accounts to report on; you can then use these optional fields to control which Accounts are included. For instance, the Chart of Accounts report or the Trial Balance report can print Accounts based on the values of these optional fields.
The other sort of optional fields associated with G/L Accounts are transaction detail optional fields. These carry information that flows from the other ledgers, like A/R Invoices, and their values are stored with the transaction details. You can include these in reports like the Transaction Listing or Batch Listing reports.
Ultimately the final output of your ERP system is the Financial Reports. Generally CFOs want to look at their Financial Reports from all sorts of angles and with all sorts of categorizations. Sage 300 ERP is quite efficient at Financial Reporting since, when it posts batches, it stores special Financial Set records that keep a lot of data pre-calculated and easy to access. These are all stored in the GLAFS table (GL0003).
The main Financial Report UI lets you choose all sorts of range criteria and sort orders. These are very useful for getting out your financial statements, but within the Financial Statement specifications there are very powerful data inquiry functions and you can use the full power of Excel to manipulate the data, create charts, create pivot tables, show geographic distributions, etc.
The big drivers of Financial Reports are the Accounts, the Account Groups and the Account segments. However, all the functions that you place in your financial reports, like FRACCT, FRAMT and FRPOST, take filters that can include all sorts of criteria on the fields in the G/L Account record, along with optional fields specified like A.ACCTCLASS = "Sales" for Account optional fields and T.QUANTITY <= 0 for transactional optional fields. Keep in mind that account optional fields would be used for filtering accounts, and transactional optional fields for filtering transactional amounts (like in FRPOST).
Usually you use the account groups to generate the main F/R statements by the usual accounting categories and then you restrict the account segments to get say departmental or geographic reports. Then you use optional fields to get more esoteric views of your financial data.
To some degree the power of the Financial Reporter depends on your having setup your Chart of Accounts properly in the first place.
But suppose you've been running for a while and realize you didn't set things up ideally for your financial reporting needs? You've now got lots of transactions posted for G/L accounts which you feel are in the wrong structure. Now what do you do? The answer is the G/L Account Code Changer module that is included with G/L (you need to activate it separately from G/L). This module will do a search and replace on the database and change all instances of a G/L account from one value to another. This way all your committed transactions and financial data will be set to the new account, and your financial reports can now be printed off the more ideal new structure. The Account Code Changer can also change the account segment separator and the segments.
Sage 300 ERP has very rich and flexible features allowing companies to create Charts of Accounts that lead to very powerful reporting capabilities. A good knowledge of the whole process is important to design that perfect Chart of Accounts, but if you make a mistake you can always use the Account Number Changer to fix it.