Stephen Smith's Blog

All things Sage 300…


The New Sage 300 Home Page


Introduction

In this article I'm going into a bit more detail on the features of the Home Page for our new Sage 300 Web UIs. This is basically the launching point for our web accounting screens, and it also provides a number of KPIs and user assistance.

We are calling this a Home Page rather than a Portal to avoid confusion with the 6.0A Portal, the Partner Portal and all sorts of other Portals. This isn't meant to be an all-in-one entry point to everything you do on the Web; it's very specific to making your use of Sage 300 easier and avoids cluttering it up with all sorts of other things.

Home Page of the Home Page

The main entry point to the Home Page is shown below. You can get to this view at any point by hitting the Home link next to the Sage 300 logo at the top left. This screenshot also shows the “Add Widgets” menu where you select which KPIs you want to see.

homepage2

Feature Tour

When you first start the Home Page, or at any time from the Help menu, you can run the feature tour, which steps you through the main features of the Home Page to get you familiarized.

homepagefeaturetour

Mega Menu

We refer to the menu where you select the accounting screen to run as the "Mega Menu" due to its size. As you can see, it's arranged in a very similar manner to the current Sage 300 Desktop, so it should be easy for users to find what they need. Of course, if you are running with security configured (as you should be), then you will only see the items you have access to, which greatly reduces the choices to sift through.

homepage3

Menu Customization

Besides using security you can also customize which menu items you don’t want to display in the Home Page. This might be because you want your users to run the VB screen due to existing screen customizations or perhaps to just reduce clutter.

homepagecust

Window Manager and Related Links

The Home Page lets you run up to ten accounting screens at once. There is a Window Manager widget on the right hand side that you use to switch between screens and to close screens you don’t need open anymore. You can slide the Window Manager widget up and down if it’s in the way of something.

Notice in the screenshot below that each screen always displays a breadcrumb along the top showing the group it belongs to, together with some related links; in this case you can see links for Tax Classes, Tax Groups and Tax Rates. This way you can usually get where you need to go quickly without using the menu.

homepagewindowmanager

Crystal Reports

We still use the same Sage 300 Crystal Reports that we’ve always used. Only now they are displayed in the Home Page, just like the accounting screens. These will show up in the Window Manager Widget just like any other screen.

homepagecrystal

Help

Of course there is always help available. Below is a screenshot of the frequently asked questions. There is help for all the accounting screens as well as the Home Page.

homepage4

Summary

This was a quick tour of our new Home Page. There are quite a few usability innovations here, and you can expect quite a few more as we move forward.

 

 

Written by smist08

August 22, 2015 at 5:16 pm

Scaling and Availability for the new Sage 300 Web UIs


Introduction

I introduced our new Sage 300 Web UIs, talked about installing them and then discussed security implications. Now suppose you have hundreds of users and things are starting to run quite slowly: what do you do? Similarly, suppose you are all happily using the Web UIs and the web server hardware breaks down, Windows Update kicks in, or Windows fails for some other reason. These two problems are closely related, since the solution to both is the same, namely adding another Web Server. If you have two Web Servers and one breaks down, then things might run a bit slower since everyone is on the remaining server, but at least they keep running. If the Web UIs slow down once you reach a certain number of users, then add another web server to distribute the load.

This article will look at the various issues around adding Web Servers. For the other parts of the system I talked about various techniques here.

Poor Man’s Scaling

Later in the article we’ll talk about automatic failover and automatic ways to distribute load. In this section I just wanted to point out that you can do this manually without requiring any extra configuration, servers or hardware.

Basically just have two Web Servers, each with its own URL (which might just be //servername/sage300) and then just assign which server your users sign on to. You would want each server to have the Sage 300 programs installed locally, but use the same shared data folder and the same databases on the same database server.

Then if one server fails, just send an email to everyone using that server to use the other one. This way it’s pretty easy to add servers, but it’s up to you to distribute your users over the servers and it’s up to you to switch the users from one server to another when one goes down or you want to do maintenance.

Sticky Load Balancer

Ideally we would like to have a pool of Web Servers that are all behind the same URL. Then as users access the URL they will be distributed among the working servers in the pool. If a server fails, this will be automatically detected and it will be removed from the pool.

loadBalancer

There are quite a few hardware load balancers on the market, most of which have the "sticky" feature that we require. Sticky means that once a user starts talking to one server, all their traffic will be directed to that same server unless it fails. Most of the Sage 300 Web UIs are stateless and don't require this feature. However, we do have a number of stateful UIs that must communicate with the same server to do things like build up an Invoice or other accounting document.

Most load balancers will detect when a server fails (usually by regularly pinging it) and hence remove it from the pool.

Many load balancers also have the feature of decoding HTTPS for you. So you have an HTTPS connection to the load balancer and then an HTTP connection from the load balancer to the Web Server. This improves performance of the Web Server since decoding HTTPS traffic can be quite computationally intensive.

High_availability_247

If you are thinking "high availability", you might now want to ask: what happens if the load balancer fails? In this case you would have two load balancers in an active/passive configuration, where the passive one takes over if it detects that the active one has failed.

You also wouldn’t want all your servers to automatically do Windows Update, otherwise they will all do this at the same time and the whole system will be unavailable during this process. It’s a good idea to take control of when Windows Update happens and to stagger it across your infrastructure.

IIS ARR

There are a number of software solutions for load balancing. After all, a hardware load balancer is really just a computer with the required network ports running load balancing software. One software solution is built into IIS, called Application Request Routing (ARR). Here you put a server in front of your Web Servers with ARR configured to know about your pool of servers, enable sticky sessions and off you go.

If you want to make the ARR server HA (Highly Available) you can add a second one. If you have them work in parallel then they need to share a SQL database that you might want to also make HA.

Geographic HA

You might want to be highly available across geographic locations. However, keep in mind that there is only one SQL database; the location that hosts it will work really well, while the other location will probably have terrible performance. Generally, for disaster recovery, if something catastrophic happens to one location, you would have a second location that you can bring online reasonably quickly, probably by restoring the SQL Server database from an off-site backup.

The Cloud

Rather than managing all these servers in your own datacenter, you might consider running them all as virtual servers in a Cloud such as AWS or Azure. Here you can create all these servers and configurations fairly easily. You can also add web servers when you need extra capacity and delete a few when you aren’t using them and want to save a bit of money.

There are lots of arguments between running in the cloud versus running in your own data center. These often revolve around data security, data privacy, cost, and the skills needed to maintain the given system. Whichever is right for you, you will still want to make sure you can configure the correct capacity for your needs and that you have the correct level of disaster recovery, failover and backup for your needs.

Summary

This was just a quick introduction to how you increase the capacity of your Sage 300 Web Servers, along with a quick discussion of High Availability. As in all things, the most deluxe solution will be very expensive, while having no solution at all will likely be unacceptable. So you will need to find the correct balance for your business.

 

Written by smist08

August 16, 2015 at 12:35 am

Installing the New Sage 300 Web UIs Securely


Introduction

Sage 300 2016 comes with new Web UIs. With the beta release I talked about how to install these, but I didn't get into the details of securing a setup that is exposed to the Internet. If you just follow the instructions from that last blog post, then you are fine in a protected LAN environment, but a number of additional steps are needed to go beyond that. A common question is how to set this up in a secure manner so that these new features won't be exploited by hackers.

Most people will probably just set up Sage 300 running on their local network. If you don't expose the web server to the Internet, then your security concerns are the same as they are today. You are just regulating what bits of information your local users are allowed to see. Generally (hopefully) you aren't as worried about your own employees hacking your network. The big security concern here is usually social engineering, which really requires good education to prevent. Note however that we have seen sites where people have added Internet access for all their employees, but unwittingly exposed their network to the Internet. It's never a bad time to re-evaluate your current security to see if there are any weaknesses.

A common way to extend to the Internet is via VPN connections. This usually works well for some devices, like laptops, but quite badly for others, like tablets. If you need better performance and don't want to worry about supporting VPN clients on a whole variety of devices, then using the standard Internet security protocols is a better way to go. All that being said, if your needs are simple, VPN is a good way to go.

For Sage 300 we've taken security very seriously and are incorporating security considerations into all parts of our Software Development Methodology. Additionally we commissioned a third party security audit of our product, and from that audit we made a number of changes to tighten up our security further. Along the way we've been watching for and guarding against SQL injection attacks and cross-site scripting attacks, among others.

For any site you should do some sort of threat risk modeling perhaps like: http://www.owasp.org/index.php/Threat_Risk_Modeling. Generally this sort of exercise gets you thinking about what you are trying to protect and what the possible threats are. Even if you do something simple like:

  • Identify bad guys – young hackers, disgruntled ex-employees, competitors, etc.
  • Identify assets – databases that you want protected, servers that should be secure, etc.
  • Identify risks – having your data stolen, having your website vandalized, having your data modified.

Then you can develop plans to protect your assets and to watch for your adversaries. You should perform this exercise even if you don’t have any web servers and feel you have a very protected environment.

A lot of security isn't a matter of being perfect, just of being better than others. Hackers tend to employ automated scanning tools to search the Internet for unprotected servers; just running only HTTPS and not having any other ports open sets the bar quite high, so the scanning tool will quickly move on to easier targets.

Nmap/Zenmap

When you expose a web server to the Internet, your first line of defense is the firewall. The firewall’s job is to hide all the internally running processes from the Internet, such as SQL Server or Windows Networking. Basically you want to ensure that the only things people can access from the outside are HTTP and HTTPS (these are ports 80 and 443 respectively). This way the only things a hacker can attack are these ports. Generally hackers are looking for other ports that have been left open for convenience like RDP or SQL Server and then will try to attack these.

A great tool to test whether any ports have been left open is Nmap/Zenmap. You run this tool from outside your network (perhaps from home) to see what ports are visible to the Internet. Below is a screenshot of running this tool against www.yahoo.com. We see that ports 80 and 443 are open as expected, but so are ports 25 and 53 (used for email and DNS). Since there are four open ports, a hacker with an exploit for any one of them can give it a try. Obviously the fewer ports open, the better; ideally only port 443 for HTTPS (though port 80 is often left open to redirect people to HTTPS automatically or to give a friendlier error message telling them to use HTTPS).

It is well worth running Nmap so you don’t have any surprises, especially since configuring firewalls can be complicated.

nmap

Qualys and CloudFlare

Zenmap is nice because it's simple and free. However, there are more sophisticated tools available that you might want to consider. For instance Qualys is a very good commercial security scanner which will do a deeper analysis than Zenmap. If your website is protected by authentication, you might want to run Qualys against a test system with authentication turned off; then it can do a much more thorough scan of all your web pages (i.e. find vulnerabilities that are only visible if you are successfully logged in).

Another protective layer is to put your site behind CloudFlare. Among other things, this will provide protection against distributed denial of service (DDoS) attacks, where hackers enlist thousands (or millions) of zombie computers to all access your site at once, bringing it down.

HTTPS

Now that your site doesn't have any unneeded open ports, we need to ensure the web site is only accessed in a secure manner. As a first step we only access it through HTTPS. This encrypts all communications, ensuring privacy, and validates that users are talking to the right server, avoiding man-in-the-middle attacks.

To turn on HTTPS you need a server digital certificate. If you already have one, then great you are all set. If you don’t have one then you can purchase one from companies like VeriSign.

To turn on HTTPS for a web site in IIS, go to the IIS management console, select the "Default Web Site" and choose "Bindings…" from the right hand side. Then add a binding for https; at this point you need to reference your server's digital certificate.

bindings

As a further step, you should now choose "SSL Settings" in the middle panel and check the "Require SSL" checkbox. This will cause IIS to reject any HTTP:// requests and only accept HTTPS:// ones.

sslsetting

Other IIS Settings

If you browse the Internet there are many other recommended IIS settings, but generally Microsoft has done good work making the defaults sensible. For instance, by default virtual directories are read-only, so you don't need to set that. Also remember that Sage 300 doesn't store any valuable data in IIS; it only stores some bitmaps, style sheets and static HTML files there. So if someone "steals" the files in IIS, it doesn't really matter; this isn't where your valuable accounting data is stored. We just want to ensure someone can't vandalize your web site by uploading additional content or replacing what you have there.

Database Setup

The new Web UIs honor the security settings set from the Security button in Database Setup. These should be set according to the screenshot below. The most important setting is to disable a user account after x failed password attempts. This prevents automated programs from trying every possible password until they eventually guess the correct one. With the settings below an automated program can only try 3 passwords every 30 minutes, which will usually get the hacker to move on and find a less secure site to try to hack.

Also ensure security is turned on for each system database, or no password will be needed to log in. Further, make sure you change the ADMIN password first, since everyone knows the default one.

secsettings

Update 2015/08/15: It's been pointed out to me that a good practice for Database Setup is for each database to have its own DBO and password. Then anyone getting access to one database doesn't get access to any other. This includes creating a separate DBO and password for the Portal database.

Vigilance

It is generally good practice to remain vigilant. Every now and then review the logs collected by IIS to see if there is a lot of strange activity, like strange looking URLs or login attempts being aimed at your server. If there is, chances are you are being attacked or probed and want to keep an eye on it. If it is very persistent you might want to work with your ISP or configure your Firewall to block the offending incoming IP addresses entirely.

Summary

The important steps are to:

  • Configure IIS for HTTPS (SSL).
  • Disable HTTP (require SSL).
  • Set more stringent security restrictions in Database Setup.
  • Do an NMap port scan of your server.

Plus follow normal good IT practices like applying Windows Updates and not running services you don't need; these are practices you should follow whether you are running a web site or not. Then keep an eye on the IIS logs to see if you are being probed or attacked.

These steps should keep your data and your server safe.

PS

This article is an update to this 2010 article I did for the 6.0A Portal. Now that we have a new Web technology stack a lot of these previous articles will need to be updated for the new technologies and for what has happened in the last five years.

Written by smist08

August 8, 2015 at 4:57 pm

Installing the Sage 300c Web UIs


Introduction

In my last post I introduced the new Web UIs for Sage 300. This post covers installing them. The product has just gone to beta, so I thought it might be helpful to provide a few details on the installation process, including some rough edges in the beta installation. I'll update this article as we go along with any problems encountered and helpful workarounds discovered. As we go to final release, I'll clean up this article to reflect the final released state.

For testing all the new features in the classic VB UIs, the normal installation procedures apply. This article is specific to the Web UIs.

Prerequisites

The beta only officially supports Windows Server 2012R2 and SQL Server 2014. By release we will support other versions of Windows and our usual suspects for SQL Server. This only applies to the Web Server where you want to host the Web UIs. All the regular versions of Windows are supported for Workstation setup type installs to run the VB UIs. For accessing the Web UIs from other computers and devices just ensure you are using the latest version of Edge, IE, Firefox, Safari or Chrome.

The main prerequisite is that you must install the Web Server (IIS) server role from the Server Manager. Then you should add all the Application Development sub-features; the key ones we need are the .Net and ISAPI features.

serverrole

Installer Options

The new Web UIs are an option in our installer. They are checked by default, but you only want to install them on the Web Server where they should run from (similar to the older Portal option).

wsinstall

When you select this option, the files will be added to an online folder under the folder you chose to install to. A number of Windows features will be enabled, like .Net 3.5 (needed for Crystal) and Message Queuing. Virtual folders and application pools will be added to IIS. No manual configuration or adjustment of IIS is required.

Database Setup

With Sage 300 version 6.0A we introduced a Portal database. This held the 4 tables required for the 6.0A Portal. The new web screens also need some database tables, and we added them to the Portal database so that you don't need to worry about yet another SQL database. So like previous versions you need to create a Portal database in SQL Server Management Studio, and then run Database Setup to configure it from the "Portal…" button. The dialog in Database Setup is exactly the same as it was in previous versions.

One difference is that the older Portal populated its tables when it ran. The new Web UIs need a few things populated ahead of time and this is done via Database Setup. So if you delete and re-create the Portal Database then you must run Database Setup again to get these tables re-populated.

Running the Web UIs

Now you should be ready to run. On the server there will be a start menu link to the new Home Page. From other computers the URL will be //servername/sage300. You can use the IP address of the server if you don't have a proper DNS entry for it.

By default we configure IIS to run these from HTTP. If you expose these to the external internet, make sure it is either via VPN or you enable HTTPS (you need a digital certificate) and turn off the HTTP port in IIS.

Beta Issues

The beta still has a few issues so I thought I would list them all here to put them in one place. I may delete this section when the product ships to avoid confusion once these issues are resolved.

  • You must use Windows Server 2012R2 and SQL Server 2014. Installing in a virtual image (like Virtual Box) is fine.
  • You can’t install both the existing (Orion) Portal and the Web UIs on the same web server. Right now the two Crystal runtimes will conflict and Crystal reports won’t work for one or the other.
  • [Update 2015/08/04] For the Portal database, due to a bug, you must give it a case-insensitive collation sequence. If it's case-sensitive or a bin-type collation then you will get some serious errors.
  • Database Setup has a bug where the first time you select the Portal… options, you will get an error about dbconfig.xml not existing. Hit Ok again and it should be created fine.
  • After running Database Setup you need to go to Control Panel – Administrative Tools – Services and restart the Sage.CNA.WindowsService service or functions like posting and import/export may not work.
  • Web UIs that use the system link won't save properly; this includes Security Groups, Authorizations and the Currency functions.
  • [Update 2015/08/18]: If your Organization ID is less than 6 characters long in Database Setup then you will get a sequence error when you try to sign on. For now please make your org-id 6 characters in length.

Diagnostics

One problem with the previous Web versions of Sage 300 is that DBSpy and RVSpy didn’t work on Windows Services (starting with Windows XP). Now these programs have been updated to spy on Sage 300 business logic that is run from a Windows service like IIS. So these tools are quite helpful in diagnosing problems in this new Web UI environment.

There are two main components to the new Web UIs, both installed under the online folder in the Sage 300 program files. Each has its own diagnostic log which provides useful information when things go wrong (quite a bit of audit information is also logged here).

Under the online folder, the web folder contains all the files and programs that comprise the ASP.Net MVC application that runs from IIS. Its logs folder contains a trace.log file with various audit information as well as much more detail on any problems that occur in the system.

Under the online folder, the worker folder contains the files and programs that make up the Sage.CNA.WindowsService. This service processes any long running jobs like posting functions, create new year and import/export; this is to keep the web server responsive to other users while someone is running a long process. Its logs folder also contains a trace.log file with detailed diagnostic information when something goes wrong.

folders

If you want to reset just the Web UI programs, you need to restart the World Wide Web Publishing Service (IIS) and the Sage.CNA.WindowsService Windows services. Note that the Sage 300 Tomcat and Sage 300 .Net Remoting services are still there, but these aren't used by the new Web UIs.

Summary

We are really excited that the first beta of our new Web UIs is shipping and we are looking forward to getting feedback and input ahead of the main general release in October. The earlier we get feedback, the sooner it can be incorporated into the product. Hopefully this article will help you get up and running and trying out these new screens.

Written by smist08

August 2, 2015 at 6:41 pm

Sage 300 Moves into the Browser


Introduction

Probably one of Sage’s worst kept secrets is that we’ve been working on a true Web version of Sage 300. Now finally at Sage Summit 2015 the cat is out of the bag and the product is officially announced as Sage 300c. You can see the announcement in this video.

We will be including Web UIs for G/L, A/P, A/R, Bank, Tax, C/S and A/S screens in the upcoming Sage 300 2016 release which is slated for October 2015 (where the year in the product roughly matches car model years).

These are true web screens that run in all the main browsers including IE, Edge, Chrome, Firefox and Safari. There are no plug-ins required, everything is pure HTML, CSS and JavaScript. This means you can access these screens from all sorts of devices like Macs, Windows Desktops and Laptops, Chromebooks, iPads, Android Tablets, Linux Workstations, Raspberry Pi’s and many other devices.

The operations modules, including I/C, O/E and P/O, will follow shortly after, early in 2016, as part of a Product Update.

New PDX Treatment

We have a new Home Page (analogous to the traditional desktop) and all the screens have been redesigned.

Sage300Home

There are 12 KPIs that you can choose from to put on your home page. You run all the screens from the menu and can move between open screens using the little window manager widget on the right.

People familiar with Sage 300 ERP will pick up this new treatment very easily since it is still the same application, just re-imagined to fit into a modern Web environment.

cna2arinv

All the familiar elements are there, including finders, import, export and printing via Crystal Reports. There are a lot of extra visual cues to help make the screens easier to learn and use. For instance, you don't have to know to hit Tab when in a key field to get it processed; you can just hit the little Go button next to it (though of course hitting Tab still works as well). In the grid you don't need to know to hit the Insert key to create a new line; there is an explicit "Add Line" button at the top of the grid.

The controls aren’t as packed together as they were in the classic VB UIs. This is to make it easier to use the screens on touch devices like tablets. Now it’s much easier to touch buttons and controls with your fingers when you don’t have the fine control of a mouse. Although these screens work quite well on tablets, it isn’t recommended you use them on a phone since this then involves a painful amount of scrolling and having a more responsive design for phones is still a version or two away.

Hybrid Offering

If you upgrade to this release, you don't need to run the Web UIs. You can still run all the classic VB UIs. In fact you can have some users running the new Web UIs and some users running the classic VB UIs. Since everything goes through the same business logic (Views) you can mix and match as you like. Of course, for modules like PJC where there aren't any Web UIs yet, you must run the VB UIs.

There isn't a full Web based Financial Reporter yet, so Financial Reporting is still done through the usual built-in F/R that is included with G/L or via Sage Intelligence. Both of these are based on the on-premise version of Excel.

Many customers rely on customized versions of the Sage 300 screens. There won't be a customization toolkit with the first release; it will be provided later. Even if it was available with the initial release, not everyone would want to immediately port their customizations to the new technology, since it takes time to overcome the learning curve and become proficient in the new technologies. To help with this scenario we provide a tool that lets you hide any UIs in the Web home page, so that users won't accidentally run the Web UI when you want them running the VB UI (probably due to customizations).

New Technology Stack

This new architecture is based on using standard ASP.Net MVC along with re-using the standard Sage 300 Business Logic. Below is an architectural block diagram of the new components that sit on top of our traditional business logic.

cna2arch

To speed bringing the product to market we were careful to stick to standard off-the-shelf technology components rather than creating our own. For instance, the UI widgets like the editable grid, the graphical charts and the date picker are from the Kendo UI toolkit. We use knockout.js for databinding. We make extensive use of jQuery and leverage quite a number of other standard technologies that are all tried and true in many large scale web applications. All the server code is written in C# and all the browser code is written in JavaScript.

You can see that there is a Wrapper layer that translates the world of Sage 300 Business Logic into the world of ASP.Net MVC. This way our application is a very standard implementation, where the Wrapper layer exposes the standard Sage 300 business logic through more modern interfaces, allowing the use of newer languages and technologies like LINQ in the higher layers.
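To make this layering a little more concrete, here is a purely illustrative sketch of what a small server side piece might look like in this kind of architecture. The type and method names below (CustomerViewModel, ICustomerWrapper, CustomerController) are hypothetical, invented for the example; they are not the actual Sage 300 wrapper or controller classes.

// Illustrative only: a hypothetical ASP.Net MVC controller sitting on top of a
// wrapper that hides the Sage 300 business logic (Views). None of these type
// names come from the real product.
using System.Web.Mvc;

public class CustomerViewModel
{
    public string CustomerNumber { get; set; }
    public string Name { get; set; }
}

// Hypothetical wrapper interface exposing the business logic in a modern .Net style.
public interface ICustomerWrapper
{
    CustomerViewModel GetCustomer(string customerNumber);
}

public class CustomerController : Controller
{
    private readonly ICustomerWrapper wrapper;

    public CustomerController(ICustomerWrapper wrapper)
    {
        this.wrapper = wrapper;
    }

    // Returns customer data as JSON for the JavaScript layer (for example Kendo UI
    // widgets bound via knockout.js) to consume.
    public JsonResult Get(string id)
    {
        return Json(wrapper.GetCustomer(id), JsonRequestBehavior.AllowGet);
    }
}

The point of the sketch is simply that the controller layer deals only in plain .Net objects, while everything specific to the Sage 300 Views stays behind the wrapper.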

Lots of Questions

I imagine many people have lots of questions on everything that is going on. This is the first of many articles on the new Sage 300 Web UIs where I’ll be going into much more technical detail. But if you leave questions in the comments of this article, I will endeavor to either answer them or make them the topic of a future posting.

Summary

We are really excited to finally be launching our Web UIs for Sage 300 into the real world. This is a first step in fully modernizing the product and providing a strong foundation for future development.

Our R&D department has been operating using Agile development methodologies and a continuous delivery process. After the first release we are looking to release updates very frequently to add value quickly on top of this rather large base release.

Written by smist08

July 29, 2015 at 12:40 am

Performance and the Sage 300 Views Part 2


Introduction

Last week we discussed avoiding table scans when using the Sage 300 ERP APIs. This week we are going to look at some other issues to do with updating data and with processing meta-data.

Last week I showed a cheetah running as an example of performance and speed (the fastest land animal), but this week here she is resting and getting some attention.

cheetah3

AOM/UI Info/ViewDoc

First, if you are wondering where to find out which indexes a View supports, there are quite a few tools to determine this. You can always look in SQL Management Studio, but then you won't know which index it is in our numbering scheme. ViewDoc is a good tool that comes with the SDK and gives this information. UI Info comes with System Manager and can drill down through the UI information to get detailed View info. Then there is the Sage 300 Application Object Model (AOM) located here. Just note that to use the AOM, you must use Internet Explorer for some obscure reason.

Updating Data

Often if you are manipulating lots of records it's in a header/detail situation. In this case all the database operations are done when you insert or update the header. The nice thing about this is that the Views know a lot about our database API and will do this in an optimal manner, so you don't need to worry about it. Similarly, if you delete a header, the View will delete all attendant details for you in an efficient manner.

But suppose you want to update a bunch of records using our .Net API and want to know the most efficient way to do this. Say we want to add something to the end of every A/R Customer Name. Our easy brute force way to do this would be:

arCUS.RecordClear();
while (arCUS.Fetch(false))
{
    arCUS.Fields.FieldByName("NAMECUST").SetValue(
        arCUS.Fields.FieldByName("NAMECUST").Value + "A", false);
    arCUS.Update();
}

 

This works but you might find it a bit slow. We can speed it up quite a bit by bracketing the whole thing in a database transaction:

mDBLinkCmpRW.TransactionBegin();
arCUS.RecordClear();
while (arCUS.Fetch(true))
{
    arCUS.Fields.FieldByName("NAMECUST").SetValue(
        arCUS.Fields.FieldByName("NAMECUST").Value + "A", false);
    arCUS.Update();
}
mDBLinkCmpRW.TransactionCommit();

 

The times from the sample program (the same one as last week but with a bit added) are:

Time to update all customers: 00:00:00.087
Time to update all customers in a transaction: 00:00:00.038

So putting things in a database transaction helped. This is for Sample Data so there are only a few customers. The updated sample program is the PerformanceSamples project located here (both folder and zip file).

Database Transactions

Generally when using our API you don't need to worry about database transactions, but occasionally, as in the above example, they are necessary. The first method above has the side effect that each update is done in a separate transaction, which means you pay the overhead of starting and committing a transaction with every record update. In the second example we start a transaction ourselves, so all the records are committed as a single transaction. Strictly speaking the two examples don't do the same thing: if the first example throws an exception part way through, then all the updates done up to that point will be in the database, whereas in the second example they will be discarded since the transaction will be rolled back. This difference can be quite important if there are database integrity issues to consider. Generally Sage 300 ERP uses transactions to go from one state where the database has full integrity to another. This way we can rely on database transactioning to always maintain full database integrity.

There is overhead to setting up and committing a transaction, but there are also resources used for every operation done inside a transaction. At some point the above example will start to slow down if you have too many A/R customers. Generally you might want to commit the transaction every thousand customers or so for optimal performance (but make sure you maintain database integrity along the way).
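As a rough sketch of that advice, using the same arCUS view and mDBLinkCmpRW database link as the examples above, committing in batches might look like this (the batch size of 1000 is just a starting point to tune, not a magic number):

// Sketch: commit in batches of 1000 records rather than one huge transaction,
// to keep lock time and per-transaction resource usage down.
int count = 0;
mDBLinkCmpRW.TransactionBegin();
arCUS.RecordClear();
while (arCUS.Fetch(true))
{
    arCUS.Fields.FieldByName("NAMECUST").SetValue(
        arCUS.Fields.FieldByName("NAMECUST").Value + "A", false);
    arCUS.Update();
    if (++count % 1000 == 0)
    {
        // End this batch and start a new transaction for the next one.
        mDBLinkCmpRW.TransactionCommit();
        mDBLinkCmpRW.TransactionBegin();
    }
}
mDBLinkCmpRW.TransactionCommit();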

Also keep in mind that records updated in a transaction are locked from the point of update through to the end of the transaction, so updating a lot of records in one transaction will lock a lot of records and cause anyone else trying to read those records to wait until your transaction completes. So try to keep transactions quick. Definitely don't do any UI type operations in the middle of a transaction (like asking the user a question).

Revisioned Views

Revision List type views store all inserts/updates/deletes in memory until you call Post. Generally these are detail views and you don't see this functionality because it's handled by the header. But occasionally you may need to deal with one of these directly (like perhaps GLAFS). In this case, since each Post is a transaction, you just need to be aware of how often you call it, as this has the same effect on performance as mentioned above.

Deleting

Although you can delete records as above, just replacing the Update with a Delete call, there is a better way. The Views have a FilterDelete method where you pass in a browse filter and all the records that match will be deleted. This will prove to be quite a bit faster than deleting the records one at a time as above.
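For example, something along these lines; the exact FilterDelete signature isn't spelled out in this article, so treat this as a sketch of the idea rather than the precise call, and check the View documentation before relying on it:

// Sketch: delete every record matching a browse-style filter in one call,
// instead of fetching and deleting records one at a time. Assumes FilterDelete
// takes a browse filter string; the filter itself is just an illustration.
arCUS.FilterDelete("NAMECUST = \"Test Customer\"");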

Meta-Data

If you run RVSpy with all the View calls selected you will see a lot of meta-data calls, getting information on fields and such. Generally meta-data calls are quite fast and don't involve going to the database. However, if you really go crazy you can slow things down quite a bit. If you make everything dynamic then you could end up making lots of meta-data calls, and cumulatively these slow you down a bit. Similarly, using constants for things like getting fields is slightly faster than passing field names, because you avoid a dictionary lookup (admittedly quite fast, but not as fast as direct access). Mostly people exercise good judgement and don't go too wild driving everything from meta-data, but we have seen some crazy cases.

Summary

Just a quick overview of some performance tips. Hopefully these all help to make your use of the Sage 300 API more efficient.

 

Performance and the Sage 300 Views Part 1


Introduction

The Sage 300 ERP Views (Business Logic) give you a great deal of power to perform Accounting operations through our various APIs. However as in any programming, performance must always be taken into account. The Sage 300 ERP Views have a lot of features to help you perform operations with good performance, but like anything if they are used incorrectly, performance can be miserable.

This article is going to talk about various features and options that you can take advantage of to improve your application’s performance. As I am writing the article, it’s getting quite long, so I think I’m going to break it into two parts.

cheetah2

Measure and Test

One of the big mistakes people make when performance tuning is to just make assumptions and changes without doing real measurements. If you have your code in a source control system, first establish a baseline for how long something takes, then make your changes and re-measure the time. Only check in your changes if the time is faster; if it isn't, then you are just churning your code and potentially adding bugs. Performance is subtle, and often the best ideas and intentions just make a process slower.

Multi-User versus Single-User Performance

This article is about optimizing processes for single users. If you want to optimize for better multi-user throughput, then it's usually all about reducing locks and keeping resource usage down. Sometimes these goals align, i.e. one person doing something quicker translates to 100 people doing things quicker; sometimes they are opposed, i.e. one person can do something way quicker by taking over all available resources, to the detriment of everyone else.

Read-Only versus Read-Write

You can open our database links and views either in read-write mode or read-only mode. Generally if you aren’t updating the data then you want to open in read-only mode as this makes things quite a bit faster. If you might update the data then we have to use more expensive SQL operations so that if you do update the data, the update is fast and multi-user considerations are handled. If you open a table or link read-only then we use much lighter weight SQL operations and the data is returned much quicker. Finders use this to display their data quicker.
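As a sketch of the difference using the .Net API, assuming an already opened session object and that AR0024 is the A/R Customers view (double-check the DBLinkFlags values and view name against the API documentation for your version):

// Sketch: open one database link read-write for updating and one read-only for
// querying. Views opened from the read-only link use lighter weight SQL
// operations. The enum and view names here are assumptions, not verified.
DBLink dbLinkRW = session.OpenDBLink(DBLinkType.Company, DBLinkFlags.ReadWrite);
DBLink dbLinkRO = session.OpenDBLink(DBLinkType.Company, DBLinkFlags.ReadOnly);

View arCusReadOnly = dbLinkRO.OpenView("AR0024");  // fast, read-only access
View arCusReadWrite = dbLinkRW.OpenView("AR0024"); // heavier, but updatable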

FilterSelect/FilterFetch versus Browse/Fetch

When you Browse/Fetch you can always update or delete the record fetched. As mentioned above, that can introduce extra overhead and slow things down. Making the table or link read-only will help Browse/Fetch, but perhaps a better method is to use the FilterSelect/FilterFetch methods, which are better optimized for SQL Server than Browse/Fetch. The results from these can't be updated or deleted, but at the same time the access method is always lightweight whether the link is opened read-only or read-write.

Indexes

Sage 300 will always use an index to read data. We have a lot of code to optimize access based on available indexes. If you use the indexes provided your code will be much faster.

For example, suppose you want to know if there are any open G/L Batches. A quick bit of code to do this is:

glBCTL.Browse("BATCHSTAT=1", true);
bool isOpenBatch = glBCTL.GoTop();

This works pretty well on sample data, but then you go to a client and suddenly it becomes quite slow. The reason is that since BATCHSTAT isn't part of the primary index, the GoTop basically goes looking through the Batch table until it reaches the end or finds an open batch. Since open batches are usually at the end, this tends to be sub-optimal. Practically, you could speed this up by searching through the table backwards, since then you would probably find one quicker, but if there are no open batches you still search the whole table. Fortunately there is a better way. The GLBCTL table has two indexes: one is its primary default index of BATCHID, and the other is a secondary index on BATCHSTAT and BATCHID (to make it an index without duplicates). So it makes sense to use this index:

glBCTL.Order = 1;
glBCTL.Browse("BATCHSTAT=1", true);
isOpenBatch = glBCTL.GoTop();

Simply adding the Order property makes this search much quicker. I included a sample program with timers and the full code. The results on sample data show the speed difference (not that it was all that slow to start with):

Time to determine if there are open batches: 00:00:00.034
Time to determine if there are open batches take 2: 00:00:00.007

The sample program is located here. It's the PerformanceSamples one (folder and zip).

So generally you want to use an index that matches the fields that you are searching on as much as possible. Usually having clauses in your browse filter that use the index segments from left to right will result in the fastest queries.

This example may look a little artificial, but once you get into the operational modules like O/E and P/O this becomes crucial. That is because the main tables, like the Order Header, have a uniquifier as the primary index. When you want to look something up, it's usually by something like order number, and to do this efficiently you have to use an alternate index. So once you are using these modules you will be using alternate indexes a lot. In these modules also be careful that quite a few alternate indexes allow duplicates, so you might get back quite a few records unexpectedly.

RVSpy/DBSpy

RVSpy and DBSpy are good tools for identifying bad behavior. The logs contain time information so you can see where the time is being used, but more often than not doing something bad for performance results in a series of operations appearing over and over in these logs. Usually scrolling to the middle of the output file is a good way to see something going awry. You can also use SQLTrace or ODBCTrace, but I find these slightly less useful.

When using RVSpy for this purpose, it helps to turn off logging to a Window (slow) and only log to a file (make sure you specify one). Further choose the View calls you want to log, usually disabling anything to do with meta-data and anything that is field level.

So if you see output like:

[5b8.7ff.37b0] CS0003: CSCCD    [01:12:06.58].Fetch(view=0x2F1047AC)
[5b8.7ff.37b0] 0 <==[01:12:06.58;t=0;ovh=0] {}
[5b8.7ff.37b0] CS0003: CSCCD    [01:12:06.58].Fetch(view=0x2F1047AC)
[5b8.7ff.37b0] 0 <==[01:12:06.59;t=0;ovh=0] {}
[5b8.7ff.37b0] CS0003: CSCCD    [01:12:06.59].Fetch(view=0x2F1047AC)
[5b8.7ff.37b0] 0 <==[01:12:06.59;t=0;ovh=0] {}
[5b8.7ff.37b0] CS0003: CSCCD    [01:12:06.59].Fetch(view=0x2F1047AC)
[5b8.7ff.37b0] 0 <==[01:12:06.60;t=0;ovh=0] {}
[5b8.7ff.37b0] CS0003: CSCCD    [01:12:06.60].Fetch(view=0x2F1047AC)
[5b8.7ff.37b0] 0 <==[01:12:06.60;t=0;ovh=0] {}

If this goes on for pages and pages, then you have something wrong.

Avoid Table Scans

Most of this article is about avoiding table scans, but just to re-iterate table scans are bad. People are often fooled by testing on sample data. Many of the tables in sample data are quite small and it doesn’t really matter what you do. However in the real world with real customer databases things will usually be quite different. For instance sample data has 9 tax authorities, which you might think is reasonable. But in the USA where any municipal government agency can charge a sales tax, there are over 35,000 tax authorities. If you read all these (like to populate a combo-box to pick one from), then you will run very slowly and your customers will be unhappy.

Summary

Sage 300 ERP has many mechanisms to access and manipulate data efficiently. But as with anything in programming, if you use APIs without due care and attention then performance (and quality in general) will suffer.

Written by smist08

March 10, 2015 at 9:44 pm
