Stephen Smith's Blog

All things Sage ERP…


Sage 300 ERP Desktops Through the Ages

with 7 comments

Introduction

With our upcoming 2014 version of Sage 300 ERP (to be released in 2013), one of the features is an improved look to our Windows Desktop Launcher. I thought it might be fun to go through a bit of history and review all the various Desktops we’ve included with Sage 300 ERP.

16-Bits

The first Windows version of Sage 300 was for the 16 Bit Windows 3.1 system. In these earlier versions of Windows, there was no ability to have folders in your lists of icons. Having nested folders of icons was a big feature of IBM’s OS/2 Desktop. IBM released this icon/folder system as a custom control for Windows. We took the IBM library and used it as a basis for our original desktop shown below:

desktop30

Here you could have folders (like Bank or Tax Services) which drilled down to further windows, basically using the innovative OS/2 technology of the day. Otherwise this desktop was implemented as a standard Windows MDI application written in C. This was the desktop used exclusively for versions 1.0A to 3.0A. It was also used for the 16-bit versions of 4.0A and 4.1A.

32-Bit

IBM never produced a 32-Bit version of their OS/2 control library for Windows. As a result we had to create a new desktop for our 32-Bit version, which we did using the Microsoft MFC framework. This was now the age of Windows 95, which introduced nested icons and tree controls in the Windows 95 File Explorer. These new controls are easy to use from MFC, and we created a new desktop with them, written in C++. This desktop has been included in all our versions since 4.0A. With 4.2A we dropped the 16-Bit version and this became our only desktop for a short while.

desktop32

The toolbar and status bar are standard MFC application features. The licensing pane is created by embedding an Internet Explorer ActiveX control.

The Web

With version 5 we introduced our first web version. This was based on building our UIs in VB6 and compiling them as ActiveX controls. We could then run them in an Internet Explorer window. The controls would be downloaded and installed automatically, and they would communicate back to the server using DCOM or .Net Remoting.

But now we needed a way to select and launch the UIs this way, so we created an ASP application to do this. This was our Web Desktop.

webdesktop

One feature of this desktop is that since we are now on the Web, we aren’t limited by the usual Windows icon sizes, so all the “icons” are actually 80×80 pixel bitmaps, which look quite a bit better than those on the standard Windows desktop.

Of course the previous 32-Bit desktop is still in the product for the majority of people who aren’t running web deployed. As it turns out, this desktop was never used much, because most people who used our web deployment mode only used it from Sage CRM, and as a result those UIs were run by CRM rather than our Web Desktop.

The Sage Desktop

Shortly after our acquisition by Sage there was a companywide initiative to standardize the desktop/launcher program across all Sage products. This was the Sage desktop.

sagedesktop

This desktop was written in C# as a standard .Net WinForms program. It included things like desktop notifications and news from Sage. It had a modern look and supported most things expected in a desktop. However, Sage 300 ERP and CRE were really the only Sage applications that adopted it, and eventually development was discontinued. This desktop was included in versions 5.5A and 5.6A.

The Sage 300 Web Portal

With version 6.0A we introduced the new Web Portal which included data snapshots and a data inquiry tool in addition to the ability to launch screens.

oriondesktop

For more information on this Portal, have a look at this blog posting.

Updated Windows Desktop

With the forthcoming release of Sage 300 ERP 2014 we will be updating the standard Windows desktop that we introduced back in version 4.0A with an updated look. We have all new icons for both the programs and for the toolbar. Plus a new look for the toolbar.

desktop62

This gives the standard Sage 300 desktop an updated look, especially by refreshing all those icons that have been in the product since version 1.0A.

Summary

A lot of attention gets spent on the Desktop/Launcher program since it is usually people’s first impression of running our product. Although most people do most of their work in a program like Order Entry, it’s still good to keep improving the Desktop. Looking back at these desktops you can see the influences of the various technologies that were popular at the time.

Written by smist08

June 16, 2013 at 10:28 pm

User Roles and Security in Sage 300 ERP

with one comment

Introduction

Role-based security and user roles are terms that are in vogue right now in many ERP systems. Although Sage 300 ERP doesn’t use this terminology, it is essentially giving you the same thing. This blog looks a bit at how you set up Sage 300 ERP application security and how it matches role-based security.

Users

First you create your Sage 300 ERP users. This is a fairly straightforward process using the Administrative Services Users function.

user1

Here you create your users, set their language, initial password and a few other security related items.

Security Groups

Security Groups are your roles. For each application you define one of these for each role. For instance below we show a security group for the A/R Invoice Entry Clerk role. In this definition we define exactly which functions are required for this role.

secgrp

Some roles might involve functions from several applications; in this case you would need a security group for each application, but they can all be assigned together for the role.

User Authorizations

User Authorizations is where you assign the various roles to your users. Below I’ve assigned myself to the A/R Clerk role.

userauth

If multiple applications are involved then you would need to add a group id for each application that makes up the role.

Thus we can create our users, create our roles (which are security groups in Sage 300 ERP terminology) and then assign them to users in User Authorizations. As you can see below, signing on as STEVE now results in a much less cluttered desktop with just the appropriate tasks for my role.

desksec

Further Security

As you can see above in the Users screen there are quite a few security options to choose from depending on your needs. One thing not to forget is that there are a number of system wide security options that are configured from the Security… button in Database Setup.

dbsec

Also remember to enable application security for the system database for your companies. For many small customers, perhaps application security isn’t an issue. I’ve also seen sites where everyone just logs in as ADMIN. But if you have several users and separation of duties is important then you should be running with security turned on.

dbsec2

Where is Security Implemented?

In the example above we see how security has affected what the user sees on their desktop. Generally, from a visual point of view, we hide anything a user does not have access to. This means setting up security is a great way of uncluttering people’s workspaces. However this is a visual usability issue: we don’t want people clicking on things and getting errors telling them they aren’t allowed. Much better to just provide a cleaner slate.

But this isn’t really security; at most it’s a thin first layer. The real security is in the business logic layer. All access to Sage 300 functions goes through the business logic layer, and this is where security is enforced. This way, even if you run macros, run UIs from outside the desktop, or find a way to run an import against something you don’t have access to, it will all fail if you don’t have permission.
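To make this concrete, here is a rough sketch using the Java API described later on this page (the exact failure behaviour is my assumption, not something demonstrated in this post): a user without rights to A/R Invoice Entry cannot use the corresponding View even though no desktop icon is involved, and the error stack explains why the call was refused.

try
{
    // AR0031 is the A/R Invoice batch View used in the Java example later on;
    // 'program' is assumed to be a Program created for the signed-on user.
    View arInvoiceBatch = new View(program, "AR0031");
    arInvoiceBatch.goTop();      // any access is checked by the business logic, not the UI
    arInvoiceBatch.dispose();
}
catch (Exception e)
{
    // the Sage 300 error stack describes why the business logic refused the call
    for (int i = 0; i < program.getErrors().getCount(); i++)
    {
        System.out.println(program.getErrors().get(i).getMessage());
    }
}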

Summary

Sage 300 ERP security is a good mechanism to assign users to their appropriate roles and, as a result, simplify their workspaces. This is important in accounting, where separation of duties is necessary to prevent fraud.

Sage 300 ERP – Data Integrity

with 4 comments

Introduction

Modern ERP systems maintain a company’s full financial history for many years. People want to be confident that all that data is correct and makes sense. So how can you be confident that your database has full referential integrity, especially after years and years of operation? The Sage 300 ERP Data Integrity function is a way to validate the integrity of a database. Modern computers are much more reliable these days than when our Data Integrity function was originally written, but it still serves a good purpose. In this article we will explore some of the mechanisms that protect data integrity in Sage 300, along with some of the possible causes of corruption.

Database Transactioning

The number one protection of data integrity in Sage 300 is database transactioning. Data is always written to the database in a database transaction. Database transactions always take the database from one state with full data integrity to the next state with full data integrity. A database transaction is guaranteed by the database server to either be written to the physical database in its entirety or not at all; the database server guarantees that you will never see part of a transaction.

For instance, as we post a G/L batch, we post each entry as a database transaction, and since each entry in a G/L batch must be balanced, database transactioning guarantees that the G/L is always in balance and hence data integrity is maintained.
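As an aside, here is a minimal plain-JDBC sketch of the all-or-nothing idea (this is not Sage 300’s actual posting code; the table and column names are made up for illustration): both sides of the entry are committed together, or neither is written.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TransactionSketch
{
    // Write a balanced debit/credit pair as a single database transaction.
    public static void postEntry(Connection conn, String debitAcct, String creditAcct, double amount)
            throws SQLException
    {
        conn.setAutoCommit(false);              // start a transaction
        try (PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO GLDETAIL (ACCOUNT, AMOUNT) VALUES (?, ?)"))
        {
            stmt.setString(1, debitAcct);
            stmt.setDouble(2, amount);          // debit side
            stmt.executeUpdate();

            stmt.setString(1, creditAcct);
            stmt.setDouble(2, -amount);         // credit side balances the debit
            stmt.executeUpdate();

            conn.commit();                      // both rows become visible together
        }
        catch (SQLException e)
        {
            conn.rollback();                    // on failure, neither row is written
            throw e;
        }
    }
}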

Where Do Integrity Errors Come From?

Database transactioning sounds great, and in fact with database transactioning we see very few database problems in Sage 300 ERP. But when we do get problems, where do data integrity errors come from?

Below is a list of some of the main causes of data integrity problems. I’m sure there are more. I’m not looking to blame anyone (including myself), just to point out the main causes I’ve seen:

  • Bugs in the program. If Sage 300 asks SQL Server to store incorrect data, it will do so in a completely reliable transactional manner. Hopefully our QA processes catch most of these and this doesn’t happen often; but, Sage 300 is a large complicated program and mistakes happen.
  • People editing database tables directly in SQL Server Enterprise Manager. For various reasons people might try to put something in the database that the program doesn’t allow, and often this leads to database corruption.
  • Third party programs that write to the Sage 300 database directly. We do a lot of data validation checking in our business logic before allowing data to be written to the database, but if this is bypassed then corruption occurs. A common one in this case is not handling currency decimal places correctly.
  • Data validation needs to be tightened. Now and again, someone has written data that we accepted as valid that wasn’t. Then we had to tighten our data validation routines. The good news here is that we’ve been doing this for a long time now.
  • Bugs in the database server. We’ve seen database indexes get corrupted, which can lead to further problems even after the indexes are fixed (because of other data written while they were corrupt).
  • Partial backups or restores. We’ve seen people back up the tables for each application independently and then restore them, perhaps to try to put A/R back to yesterday. But this corrupts the database, since there is often matching data that needs to be in sync in Bank, Taxes or perhaps Order Entry. Make sure you always back up and restore the database as a whole.
  • Hardware glitches. Even with CRC checking and such, strange errors can start to appear from hard disk or memory hardware failures in computers.

The Data Integrity Checker

To find these sorts of problems, Sage 300 ERP has a data integrity checker in its Administrative Services. The main screen looks like:

dataint1

You select the applications you want to check and whether you want to fix any minor errors. Since this can be a long process, for several applications you can also configure which parts within the application to check by selecting the application and choosing the application options in a screen like:

dataint2

The end result is a report that lists all the errors found.

What Does the Integrity Checker Look For?

So what does the Integrity Checker do? Below is a list of some of the checks that are typically made:

  • Check the integrity of each file by reading each record and calling the View Verify API, which calls the business logic to validate the record. This includes things like checking that the decimals of a money amount are correct, that the data is correct for its data type, and that foreign keys are valid.
  • For Header/Detail type relationships there are often total or summary fields in the header, like the total amount of an order or the number of detail lines. The integrity checker will read through the details and add up any of these numbers to ensure they match the header (a conceptual sketch of this check follows this list).
  • Check the database for any detail records that don’t have a matching header record (orphans).
  • Each application then knows about all sorts of cross file relationships that must be maintained and the Integrity Checker for that application will validate all of these relationships.
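For illustration, here is a small self-contained Java sketch of the header/detail totals idea from the second bullet above (plain Java objects, not the actual integrity checker code or real Sage 300 field names):

import java.util.ArrayList;
import java.util.List;

public class TotalsCheck
{
    static class OrderHeader
    {
        double total;                                     // total stored on the header
        List<Double> detailAmounts = new ArrayList<>();   // amounts on the detail lines
    }

    // Returns true if the detail lines add up to the header total
    // (within a small tolerance for rounding).
    static boolean headerMatchesDetails(OrderHeader h)
    {
        double sum = 0.0;
        for (double amount : h.detailAmounts)
        {
            sum += amount;
        }
        return Math.abs(sum - h.total) < 0.005;
    }

    public static void main(String[] args)
    {
        OrderHeader h = new OrderHeader();
        h.total = 30.0;
        h.detailAmounts.add(10.0);
        h.detailAmounts.add(20.0);
        System.out.println(headerMatchesDetails(h));      // prints "true"
    }
}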

What Does Fix Minor Errors Do?

There is the check box to fix minor errors, but what does it do? Mostly it fixes up header/detail relationships by fixing any total or summary fields in header records. It can also delete orphan detail records. But generally it doesn’t attempt much because we don’t want to risk making things worse.

But it’s Slow

The big complaint about the Data Integrity checker is that it’s slow. This is because it does go through every record in the database as well as checking all the cross dependencies. These days we see company databases that are hundreds of gigabytes in size. Generally the complaint is that you can’t just run it as routine maintenance overnight; you tend to have to configure what you want to run and do that selectively. It’s also best to run it when people aren’t in the system, since it does put a fair bit of load on the system.

But this does open up an opportunity for third party developers. Companies like Tairox and Orchid offer solutions to automate data integrity or to run it as a service.

Summary

Even with super reliable modern databases and hardware, data integrity errors can still creep in and need to be dealt with. Just being aware they exist is half the battle. Also remember that it is extremely important to have regular full backups of your data in case of a really catastrophic failure.

 

The Sage 300 ERP Java API

with 9 comments

Introduction

With version 6.0A of Sage 300 ERP we introduced a native Java API to all the Sage 300 Business Logic (Views). We did this in support of our SData implementation, which we wrote in Java. This API allows Java programmers to access all the Sage 300 ERP business logic along the same lines as our .Net API and our COM API. This API isn’t built on top of either COM or .Net; it talks directly to the underlying C DLLs in System Manager. This provides better performance, as well as allowing us to compile this part of the system for Linux with no Microsoft dependencies. Internally we usually refer to this API as SAJava.

All the Sage 300 Business Logic objects have the same API, which makes it easier for us to produce these different APIs to facilitate interoperability with all sorts of external systems, allowing programmers to write code in a natural manner, with any required interop layer provided by us. The Java API uses a Java Native Interface (JNI) interop layer to talk to our Windows DLLs (or Linux shared objects). This is a one-way communication: we only use it to call the DLLs; we never have the DLLs calling our Java code (that direction is often dangerous and leads to the problems frequently encountered with JNI). Our JNI code handles all the data conversions between Java and C, as well as providing exception handling to trap and handle exceptions that can happen in C code (like bad pointers).

I’ve blogged about this API a bit indirectly in the past when talking about how to write server side code for our SData service, for instance here, here and here. Generally, to add custom programming to SData feeds, you write Java classes that inherit from our standard SData classes. When you interact with the Views in this environment you use this Java API, but all the libraries are already included and all the details of signing on are handled for you. The framework starts you off at a point where you can directly open and call Views. In this posting we’ll back up a bit to cover full usage, unlike the case where the SData programming framework does a lot of the work for you, so that you can use this API directly in isolation without requiring any other framework.

Getting Started

First, to use the Java API, you need to include its jar file in your project. This file is located in the Tomcat\lib folder. This changed a bit between version 6.0A and the 2012 version. For 6.0A the folder is: C:\Program Files (x86)\Common Files\Sage\Sage ERP Accpac\Tomcat\lib and the file is SystemManager.jar. For the 2012 version the folder is: C:\Program Files (x86)\Common Files\Sage\Sage 300 ERP\Tomcat\lib and the file is com.sage.accpac.sdk.accpac.sajava-6.1.jar. Then you need to import the classes into any source file that uses them via:

import com.sage.accpac.sm.*;

Once you have these things included in your Java project you can start creating objects and calling methods. However, due to security, you must first sign on to a session and then create all other objects from this session.

The documentation is in the form of JavaDoc and is located on the DPP Wiki. The 2012 version is here: http://dppwiki.sage300erp.com//javadocs/v6.1/SystemManager/. You can find all the classes, methods and properties here. To access this, you must be part of the Sage 300 ERP Developer Program. A key benefit to joining this program is access to this wiki which contains all the developer documentation that we produce.

Signing On

First you must create a session with some code like:

Session session;
session = new Session(new ProgramSet(), new SharedDataSet(), "ADMIN",
     "ADMIN", "SAMINC", new Date());

This will sign your session on to the company SAMINC using today’s date as the session date. The ProgramSet and SharedDataSet are used when we deploy in a hosted configuration and run multi-tenant. In that case they must be set up correctly by the system to configure which tenant this session is for. In most normal on-premise applications the indicated calls are fine to give the one default tenant that exists.

Then you must create a program from the session:

Program program;
program = new Program(session, "XZ", "XZ0001", "61A");

If you read my last blog post, this might appear a bit backwards compared to the COM API, where this information is provided in the session.Init call that comes first. That is true, but the information is required regardless.

Using Views

Now that you have a program you can start opening and using Views. As an example, let’s look at a method that enters A/R Invoices. Like many things I started with macro recording to get the right Views and some syntax. Macro recording produces VBA code, but it isn’t hard to convert this to Java quickly. Anyone familiar with Sage 300 ERP macro recording will recognize the style and variable names in the following method. This method assumes there are class variables for the program and session that were created as indicated above. The key point of the following example is to show how to open Views, compose Views and then use the Views. For more general information on Sage 300 ERP’s Views have a look at this and this.

    public String enterARInvoices()

    {

        int iEntry;
        int iDetail;
        int numEntries = 20;
        int numDetails = 5;
        String sBatchNum;

        View ARINVOICE1batch = new View(program, "AR0031");
        View ARINVOICE1header = new View(program, "AR0032");
        View ARINVOICE1detail1 = new View(program, "AR0033");
        View ARINVOICE1detail2 = new View(program, "AR0034");
        View ARINVOICE1detail3 = new View(program, "AR0402");
        View ARINVOICE1detail4 = new View(program, "AR0401");
        View ARCUSTOMER1header = new View(program, "AR0024");

        ARINVOICE1batch.compose ( ARINVOICE1header );
        ARINVOICE1header.compose (ARINVOICE1batch, ARINVOICE1detail1, ARINVOICE1detail2, ARINVOICE1detail3, null);
        ARINVOICE1detail1.compose (ARINVOICE1header, ARINVOICE1batch, ARINVOICE1detail4);
        ARINVOICE1detail2.compose (ARINVOICE1header);
        ARINVOICE1detail3.compose (ARINVOICE1header);
        ARINVOICE1detail4.compose (ARINVOICE1detail1);

        // Create the batch

        ARINVOICE1batch.recordGenerate(RecordGenerateMode.Insert);
        ARINVOICE1batch.set("PROCESSCMD","1");      // Process Command

        ARINVOICE1batch.process();
        ARINVOICE1batch.read(false);

        sBatchNum = ARINVOICE1batch.get("CNTBTCH").toString();

        // Loop through creating the entries

        for ( iEntry = 0; iEntry < numEntries; iEntry++ )
        {
            try
            {
                ARINVOICE1detail1.cancel();
                ARINVOICE1detail2.cancel();
                ARINVOICE1header.recordGenerate(RecordGenerateMode.DelayKey);
                ARINVOICE1detail1.recordClear();
                ARINVOICE1detail2.recordClear();

                ARINVOICE1header.set("PROCESSCMD","4");

                ARINVOICE1header.process();

                if ( false == ARCUSTOMER1header.goNext() )
                {
                    ARCUSTOMER1header.goTop();
                }

                ARINVOICE1header.set("IDCUST", "1200");

                for ( iDetail = 0; iDetail < numDetails; iDetail++ )
                {
                    ARINVOICE1detail1.recordClear();
                    ARINVOICE1detail1.recordGenerate (RecordGenerateMode.NoInsert);
                    ARINVOICE1detail1.process();

                    ARINVOICE1detail1.set("IDITEM", "CA-78" );                     // Item Number

                    ARINVOICE1detail1.insert();

                }

                ARINVOICE1header.insert();
            }
            catch( Exception e )
            {
                int count = program.getErrors().getCount();
                if ( 0 == count )
                {
                    e.printStackTrace();                   
                }
                for ( int i = 0; i < count; i++ )
                {
                    System.out.println(program.getErrors().get(i).getMessage());
                }
            }
        }
        ARINVOICE1batch.dispose();
        ARINVOICE1header.dispose();
        ARINVOICE1detail1.dispose();
        ARINVOICE1detail2.dispose();
        ARINVOICE1detail3.dispose();
        ARINVOICE1detail4.dispose();
        ARCUSTOMER1header.dispose();

        return( sBatchNum );
    }

Notice that you can explicitly close things by calling the dispose method. This is usually preferable to waiting for the Java garbage collector to reclaim things; it tends to keep down resource usage if you are opening and closing things a lot.
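A common pattern, sketched below using the same View and Program classes as the example above (assuming a program object created as shown earlier), is to dispose of a View in a finally block so it is released even if an exception is thrown:

View customers = new View(program, "AR0024");   // A/R customer View, as in the example above
try
{
    customers.goTop();
    // ... work with the view ...
}
finally
{
    customers.dispose();                        // release the underlying resources promptly
}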

Errors

If a call fails, there are a couple of cases. If it’s a simple expected thing, like reaching the end of records when fetching through them, then the routine will return a simple return code that you can easily handle in your code. If something worse happens then the routine will throw an exception. As in other Sage 300 ERP APIs, there is an error stack which may contain a number of error messages explaining what went wrong. In the catch block above we first check whether there are any errors on the error stack; if not, we print the stack trace to allow debugging of what went wrong. Otherwise we loop through the Sage 300 errors and print them for diagnostic purposes. When programming Sage 300 ERP, always make sure you have an error handler, as it can give you very good information when debugging your program.
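If you call the Views from several places, it can be handy to factor the error reporting out of the catch block above into a small helper; this sketch just reuses the same getErrors() calls shown in the example:

static void reportErrors(Program program, Exception e)
{
    int count = program.getErrors().getCount();
    if (count == 0)
    {
        // nothing on the Sage 300 error stack, so this is a pure Java problem
        e.printStackTrace();
        return;
    }
    for (int i = 0; i < count; i++)
    {
        System.out.println(program.getErrors().get(i).getMessage());
    }
}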

Summary

The Sage 300 ERP Java API gives yet another tool for integrators to integrate to Sage 300 ERP from external systems. It is ideal for Java programmers who would like to write their integration entirely in Java. This is often a benefit when the SDK for the external system is itself written around the Java programming language.

Opening Sage 300 ERP Sessions

with 9 comments

Introduction

Sage 300 ERP has a number of very flexible external APIs that allow programs to access all the business logic in the program. The business logic is stored in Views that are accessed via a standard API. To start using the business logic from one of our external APIs you first need to sign on to the API and establish a session. This article is only going to talk about the AccpacCOMAPI, which is our main COM API. Sage 300 ERP has an older COM API usually referred to as a4wcom, so be sure to use the newer one we are talking about here. Many of the concepts can be adapted to other APIs like the .Net or Java APIs. However, to interact with other COM components like the Session Manager you must be using the AccpacCOMAPI. The examples in this posting will all be in Visual Basic 6.

This API has been around for a long time, but we recently received quite a few queries through customer support on establishing connections. So I thought it might be worthwhile writing a blog post on some of the use cases we try to support, some of the functionality that perhaps isn’t very widely known, as well as the reasons why some aspects work the way they do.

For a bit more background on the Sage 300 business logic have a look at this blog posting.

Libraries

Sage 300 ERP’s COM API can be used by any tool that understands COM and how to talk to COM objects. The first step is to add the COM object to your project. In VB6 you do this by going to Project – References and adding “ACCPAC COM API Object 1.0”. In some tools you can browse to the DLL and add that; in this case you browse to wherever you installed Sage 300 ERP and then to runtime\a4wcomex.dll.

Creating and Initializing

Once you have the library available, you need to get started. All other objects in our COM API are created via the API itself, but first you need to create and initialize a session object. This is the root object from which everything else is derived. In VB there are a couple of ways to create the initial session object, either:

Dim mSession As New AccpacCOMAPI.AccpacSession

Or

Dim mSession As AccpacCOMAPI.AccpacSession
Set mSession = CreateObject("Accpac.Session")

Once you have a session object then you need to initialize it:

mSession.Init "", "XY", "XY1000", "61A"

If you are accessing the COM API from an external program and not an SDK application, then the parameters don’t matter much. The first parameter is for when an SDK application is run from the desktop, to connect them up properly, and the other parameters are similarly for SDK applications, for things like finding your application’s help files correctly. Generally for an external application you just want these set with valid values so things will proceed. The application ID “XY” is reserved for non-SDK applications to use, so you don’t have any risk of things being confused with a third party application. It is important that you call Init before doing anything else. If you call some other method first, expect to get strange error messages.

Below is the object model of all the objects you can get from an initialized session:

objectmodel

Company List

At this point we still haven’t signed into a company. At this point you can really just sign-on, but you can also get a list of companies that you can sign-on to. This is the API used by Sage 300 to build sign-on dialogs. In the session object is an organizations collection that you can traverse to get the information on the available companies.

For i = 0 To mSession.Organizations.Count - 1
    Print mSession.Organizations.ItemByIndex(i).DatabaseID, _
        mSession.Organizations.ItemByIndex(i).Name
Next i

As you can see by the code, this API was invented by a C programmer and not a VB programmer.

Signing On

The main way you sign-on to a company is to use the open method.

mSession.Open "ADMIN", "ADMIN", "SAMLTD", Date, 0, ""

The main things you need for this method are the user id, password, company id and session date. After calling this, the next thing you usually do is create a database link, and then from the database link create your view objects. Now you can call the views and use all the Sage 300 business logic. The disadvantage of this method is that you need to know the user id and password. But otherwise you are good to go.

Session/Signon Managers

Of course with what we have discussed so far you could create your own sign-on dialog. But why re-invent the wheel? The main Sage 300 ERP COM library is intended to be called from both user interface programs and server processes; as a result it has no user interface functions itself and will never pop up a message box or a dialog box. It is strictly processing, with no UI.

However we do provide a number of other ActiveX controls that are intended to be used as UI components. Two of these are the Signon Manager and the Session Manager. You only interact with the Session Manager and then the Session Manager uses the Signon Manager whenever it needs it.

So if you don’t want to have to know the user id and password then you use the Session Manager to create your session for you and you get back a session that has been created, initialized and opened for you. The user will be able to enter their user id, password and select the company and session date to use for processing.

To use the Session Manager you need to add a reference for “ACCPAC Session Manager 1.0” or access the runtime\a4wSessionMgr.dll. Then you would write some code like:

Dim signonID As Long
Dim mSession As AccpacCOMAPI.AccpacSession
Dim sessMgr As New AccpacSessionMgr

sessMgr.AppID = "XY"
sessMgr.ProgramName = "XY1000"
sessMgr.AppVersion = "54A"
sessMgr.CreateSession "", signonID, mSession

The intent of the Session Manager was to facilitate things like workflow management. So the first time someone accesses it, it will create a new session and the user will get a signon dialog. However, the next time it is accessed, you will just get back the session the user opened the first time. This allows applications to be strung together in a workflow type manner without each step requiring the user to sign on. If you do want a fresh sign-on, you can set the ForceNewSignon property to true. If there are two desktops signed in and ForceNewSignon is false, then the user will get a dialog box to choose which session they want.

Summary

The external APIs to Sage 300 ERP are very powerful. Since the AccpacCOMAPI is used exclusively by our VB forms to access the Sage 300 business logic, you know that from this interface you can do anything that can be done from a regular UI. All business logic is exposed this way. So the intent of this posting was just to give you a little help in getting started to get at all that business logic.

The Road to DevOps Part 2

leave a comment »

Introduction

Last week we looked at an introduction to DevOps and concentrated on the issues around frequently deploying new versions of the software. This week we are continuing to look at DevOps but concentrating on issues with maintaining and monitoring the system during normal operations. This includes ensuring the system is available, provisioning new users, removing delinquent users and generally monitoring the system and ensuring it is healthy.

SLAs

Generally everyone wants a service that is always available, always healthy and working well. But this is too vague a statement, and the reality is that things happen and need to be dealt with. This has to be acknowledged up front and strategies put into place to deal with them. First you need stronger guidelines, and this usually starts with a Service Level Agreement (SLA) that is laid out for your customers. This details various metrics that you are promising to achieve and what happens when you don’t. Generally you need a good set of performance metrics to judge your service against.

Some of the common performance metrics are:

  • Throughput: System response speed.
  • Response Time: How quickly will a given issue be resolved?
  • Reliability: System availability.
  • Load balancing: When elasticity kicks in.
  • Time Outages: Will services be unavailable during that time?
  • Service Slow down: Will services be available, but with much lower throughput?
  • Durability: How likely data is to be lost.
  • Elasticity: How much a resource can grow?
  • Linearity: System performance as the load increases.
  • Agility: How quickly the system responds to load changes.
  • Automation: Percent of requests handled without human interaction.

Even if you don’t publish these metrics externally, you need to track them to know how you are doing. Generally a DevOps team takes the approach of continual improvement (like Kaizen). A good DevOps team has dashboards that track these metrics and is always looking for ways to improve them.

Monitoring

A basic rule with cloud applications is that you need to instrument and monitor everything. First, this allows you to generate your SLA dashboard and ensure you are meeting your SLA. Second, it lets you provide feedback to development on what is working well and what is working badly. For instance you can track how much a given feature is being used, or perhaps how many people start using a feature but don’t complete the operation. This could highlight a usability problem that needs to be addressed.

The same goes for performance optimizations. You don’t want to bother optimizing something that is infrequently used. But from good monitoring you can see, for instance, a query that is being run very frequently and not delivering good performance. Attacking this would be helpful, both for people issuing the query and for other people slowed down while these slower queries are being processed.

The key point being to address what really matters, based on hard facts gathered by good instrumentation on what is really affecting your users. Perhaps you don’t need a monitoring center like the one below, but it sure would be cool.

Provisioning

Another operation that hopefully is going on at a rapid pace is provisioning new users. And then the reverse, hopefully at a very slow pace, is removing users. In normal operations, users should be able to sign up for your service very easily, perhaps filling out a web page, and then acknowledging a confirmation e-mail. All this should be done pretty much instantly.

What should not happen, is that the user fills out a web form, which is then submitted to a queue to be processed, then in the data center, someone reads this request and performs a number of manual steps to setup the user. Then hours or days later an e-mail is sent to the customer letting them know they can use the service.

This is really a matter of a DevOps team’s focus on automation. Chances are the steps of the manual process need to be performed, which is ok, as long as they can be automated (scripted) and performed automatically and quickly, eliminating any time delay. Generally a DevOps team is always looking to find any manual process and eliminate (automate) it.

Elastic Operations

Most people don’t buy their own data centers or their own server hardware anymore. Especially when starting up, you don’t want to make a huge investment in capital equipment. Most people use an IaaS or PaaS service like Amazon or Azure. These services are “elastic” in that you can run scripts to add or remove capacity, so they stretch and shrink with demand. You pay for what you use, so for each server you have running in this environment, you are paying some fee.

Generally a DevOps team should be monitoring the system load and when it hits a certain level, scripts are run that create a new server and add it to the system, so the load is shared by more computing resources. By the same token, when usage drops, perhaps on the weekend or late at night, you would like to drop some of these computing resources to save money. Again the DevOps team needs to develop the necessary programs and scripts to support this sort of operation in an automatic manner (you don’t want to be paying someone to juggle these system resources). Adding resources is usually easier, since they come up empty and once they are known to the load balancer they will start being used. When shutting down, you have to monitor and ensure no one is using the server before shutting it down (or have a method to move the active users to another server). Often this is done by stopping new requests going to the server and then just waiting for all the users to log off or become inactive. Generally if your application is completely stateless, then this is all much easier.

Disaster Recovery

It is also the responsibility of the DevOps team to ensure there is a good disaster recovery plan in place. For instance the Azure and Amazon services have multiple datacenters. You need to control how your application is deployed to these and how backups and redundancy are managed. Generally a higher level of redundancy and a quicker switch-over cost more money, so you need to make sure your plan is sufficient, but not overdone.

Suppose you use Azure or Amazon and even though you have a redundant deployment to multiple datacenters, the whole service goes down? Some companies actually have redundancy across IaaS providers so if Azure goes down, then they can still run on Amazon. Practically speaking, unless you are very large or have a very tight SLA, this tends to be overkill. Generally you just put in your SLA that you aren’t responsible for the provider being systemically down.

Summary

The transition from Waterfall to Agile development was an interesting one with a lot of pitfalls along the way. The transition from Agile development to DevOps is a bigger step and will involve many new learning opportunities. It takes a bit of patience, but in the end it should lead to an improved development organization and happier customers.

 

Sage 300 ERP 2012 RTM

with 3 comments

Yes, Sage 300 ERP 2012 has been “Released to Manufacturing”. In a way this is really a “Release to Marketing”: since we don’t really manufacture much anymore, it gets posted for download and then sales and marketing take over. I think everyone prefers keeping the acronym RTM rather than changing to RTW for “Released to the Web”. I previously summarized all the great things in the release in my Sage 300 ERP 2012 posting.

It’s been a lot of hard work and a tumultuous journey since we released 6.0A at the end of 2010. But we are really happy with this release; it includes many useful new features as well as building on a number of foundations ready for future development.

Now that we are RTM, business partners should be able to start downloading it on Sept. 5 and DVDs should be available by Sept. 18.

Rebranding

Sage Accpac ERP is now released with the new Sage branding and is now Sage 300 ERP 2012. This means we now match the revamped Sage web site and fit in nicely with all the new sales and marketing material. Hopefully now we can fully leverage and build on the Sage brand to ensure people are familiar beforehand with our products.

In addition our editions are changing. It would be confusing to have Sage 300 ERP 200 Edition 2012. So 100, 200 and 500 editions become Standard, Advanced and Premium Editions. Hence something like Sage 300 Advanced ERP 2012.

Manufacturing

When I started with Computer Associates on the original CA-Accpac/2000 project, manufacturing was a much bigger deal than it is today. In those days we produced a boxed product that consisted of floppy disks, printed/bound manuals, many special offer cards and the copy protection dongle all shrink wrapped in plastic.

Back in the 90s we had quite a complicated schedule of when everything had to be submitted so that it could all come together on our release date. For instance manuals took 1 month to get printed, and disks took 1 week to get duplicated and labeled (if we were lucky). So the technical writers had to be finished a month ahead of the programmers. Similarly any included marketing material, as well as the design for the box had to all be submitted quite early.

Back then we released on 3 ½ inch 720K floppy disks (they were actually in hard plastic by this point). Each module took 6 or 7 disks, so you had a stack of disks for System Manager, a stack for General Ledger and so on. Generally a single 720K floppy was quite a bit more expensive than a blank DVD is today. (In fact the first version of Accpac was released on 8” floppies for the North Star CP/M computer, but that was before my time.)

After we shipped the gold master floppy disks off to manufacturing, we still had one week to QA while they were being duplicated. We would continue regression testing through the week looking for any more serious issues. If something was found, it was quite expensive, since usually any manufactured floppies were thrown away and new ones were duplicated.

For a while we produced 5 ¼” floppy disks which were available by demand. With version 3.0A we switched entirely to CDs, but we still shipped one module per CD. With CDs it then became practical to provide things like PDF versions of manuals on the CD along with other extras that were impractical on floppy disks.

One thing with having all the modules on separate CDs was that we could stagger the release, so we would release first perhaps SM and the financial modules then the operation modules a few months later and the Payroll modules a few months later still and various other things even later. The end result being that when we first announced RTM on a version, then it would be nearly a year before all the modules, options, integrations, translations, etc. were all fully released.

Now there is only one RTM for a version and this RTM includes everything on one download image (or one or two DVDs). This includes all ERP modules, all documentation, all options products, translations in five languages and all integrations (like CRM and HRMS). So now when we RTM, a customer knows that all Sage components they need are ready and they can go ahead and start the upgrade process. We also work with all our ISVs to try to get their products certified for the new version as quickly as possible.

These days everything is on-line, so the web site needs to be ready to link to the new release and then we provide the download images that are posted there. We still produce a gold master DVD, since people can order one if they want it (for a fee).

Release Cycles

Although not visible outside of development, we also run our release cycles quite differently now than we used to. In the early versions all the coding was done first; then, when we decided it was code complete, we threw it over the wall to QA and went through a long find-and-fix-bugs phase. Generally we shared QA with Sage 50 Canadian (then known as Simply Accounting) and one product was QA’ed while the other was coded.

Now we use an Agile development process; QA is involved from the start and there are no separate development and QA steps. Nothing is considered code complete or done until it is fully QA’ed and documented. Generally this leads to more accurate schedules and higher quality products.

Summary

We are very excited to be releasing Sage 300 ERP 2012. We hope that people upgrade to it and enjoy using it. We are also excited to be starting work on the next version, which is already looking very promising.

 

Written by smist08

September 1, 2012 at 5:04 pm
