Stephen Smith's Blog

Musings on Machine Learning…

SData Enhancements for Sage 300 ERP 2012

Introduction

I’ve previously blogged on the enhancements to the framework for creating custom SData feeds for applications here and here. In this posting I’m looking at enhancements to our core SData protocol support. We’ve been working hard to add more features and correct some inconsistencies and bugs.

The main purpose of this exercise is to make SData work better for integrators and to make sure the Sage 300 ERP SData feeds work well with other Sage tools like the Sage CRM dashboard and the Sage Argos Mobile Framework.

Global Schema

SData features are mostly of interest to programmers. However, some, like this one, enhance existing integrations between different products. Global schema is a mechanism to return all the SData metadata for a dataset (company) in a single call. In version 6.0A, you could only get the metadata for one SData resource per call. Rather esoteric, but having this enhances our integration to the Sage CRM SData dashboard. Previously, when you created an SData widget pointing to a Sage 300 ERP SData feed, you needed to specify the $schema for a specific feed, something like:

http://sage300erpdemo.na.sage.com/SDataServlet/sdata/sageERP/accpac/SAMINC/arCustomersFinder/$schema

Now you can give the $schema for the whole company using a URL like:

http://sage300erpdemo.na.sage.com/SDataServlet/sdata/sageERP/accpac/SAMINC/$schema

This means you don’t need to know the resource component of the URL. In Sage CRM, you first give the URL to the global schema:

Then you get a list of SData resources to pick from in a more human-readable form:

Previously you only got the feed you specified. Then you select a feed and hit Next to choose the fields you want from that feed.
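The difference between the two URL forms can be sketched with a small helper. This is purely illustrative; the base URL, dataset, and resource names are the ones from the examples above, and the helper function itself is not part of any Sage API.

```python
# Minimal sketch of the two $schema URL forms discussed above; the helper
# function is illustrative, while the base URL and names come from the post.

BASE = "http://sage300erpdemo.na.sage.com/SDataServlet/sdata/sageERP/accpac"

def schema_url(dataset, resource=None):
    """Build a $schema URL: per-resource (6.0A style) or global (2012 style)."""
    if resource:
        return f"{BASE}/{dataset}/{resource}/$schema"
    return f"{BASE}/{dataset}/$schema"

# Per-resource schema, as required in 6.0A:
print(schema_url("SAMINC", "arCustomersFinder"))
# Global schema for the whole company dataset, new in 2012:
print(schema_url("SAMINC"))
```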

SData Validation Tool

Sage has written a tool that validates the correctness of SData feeds. The tool is available here (you need to scroll down to near the bottom of the page). The intent is for anyone, whether internal or external to Sage, to be able to validate any REST web service for SData compliance against what is described in the spec at the SData website. This tool was around in the 6.0A days, but it needed work; back then, 6.0A passed the feed validator, while the new tool reports a lot of problems against 6.0A. For 2012, quite a bit of work went into making our feeds compliant, which means you can expect them to work as the specification states, and integration with other SData-aware products and tools becomes much easier. The tool is continuously being updated and will probably auto-update itself as soon as you install it. Below is a screenshot. Hopefully by release a few of the remaining errors will have been corrected.

Argos

Argos is a framework for creating mobile SData clients using HTML5, JavaScript, and CSS. It was originally developed by the Sage SalesLogix group to create the mobile interface for the Sage SalesLogix Mobile product. However, since SalesLogix uses SData as its Web Services interface, this library was built entirely on SData. As a consequence it can be used with any product that supports SData.

As part of our Sage 300 ERP 2012 development we tested Argos on our SData feeds and produced a sample mobile application.

As part of this development we fixed a couple of bugs and made sure our SData support works well with the Argos SDK. I’ll write a future blog posting with more details on the Argos SDK and how to write mobile applications for Sage 300 ERP. However, if you are interested in Argos, you can check it out now, since the whole project is open source and available on GitHub:

E-Tags

We finished implementing e-tags in this version. These allow proper multi-user control when you have multiple sources updating records. When you read a record, it returns an e-tag containing the date and time the record was last modified. When you update the record, this e-tag is included with the update message, and the server can then see whether the record has been modified by someone else since you read it. This detects the multi-user conflict. Sometimes the server will just merge the changes silently and everyone is happy; sometimes the server will need to send back the dreaded “Record has been Modified by Another User” error response.
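The check described above can be sketched as a tiny in-memory simulation. To be clear, this is not actual Sage 300 server code; the `RecordStore` class is invented for illustration, and the etag here is a simple modification counter standing in for the last-modified date and time.

```python
# Illustrative in-memory simulation of the e-tag check described above
# (not actual Sage 300 server code). The etag here is a modification
# counter standing in for the record's last-modified date and time.

class ConflictError(Exception):
    """The dreaded "Record has been Modified by Another User" case."""

class RecordStore:
    def __init__(self, data):
        self.data = data
        self.etag = 1

    def read(self):
        """Return a copy of the record along with its current e-tag."""
        return dict(self.data), self.etag

    def update(self, new_data, etag):
        """Apply an update only if the caller's e-tag is still current."""
        if etag != self.etag:
            raise ConflictError("Record has been Modified by Another User")
        self.data = new_data
        self.etag += 1

store = RecordStore({"name": "ACME"})
_, etag_a = store.read()    # user A reads the record
_, etag_b = store.read()    # user B reads the same record
store.update({"name": "ACME Ltd."}, etag_b)      # B updates first; etag advances
try:
    store.update({"name": "Acme Inc."}, etag_a)  # A's e-tag is now stale
except ConflictError as e:
    print(e)  # the multi-user conflict is detected
```

A real server might instead merge non-overlapping changes silently, as the post notes; the sketch only shows the conflict-detection half.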

Using Detail Fields in Select Clauses

In 6.0A, if you read an SData feed with header and detail components (like O/E Orders), then you got back the header fields and links to the details, even if you specified detail fields in a select clause. This meant that if you wanted both the header and detail lines, you needed to make two SData calls. It was also annoying because the format you got back reading records differed from how you write records, so you had to combine the separate header and detail results back together to do an update or insert. Now, if you specify detail fields in the select clause, you get back all the specified fields in the XML payload, typically a header with multiple details returned in the same call. This saves an SData call, and it is also much easier to deal with, since you now have a correct XML template to manipulate for future inserts and updates.
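Such a query might look like the sketch below. The resource and field names (`oeOrders`, `OrderDetails/Item`, and so on) are made up for illustration and are not the actual accpac contract names; the point is only the shape of a select clause mixing header and detail fields.

```python
# Hypothetical select clause mixing header and detail fields. The resource
# and field names (oeOrders, OrderDetails/Item, etc.) are made up for
# illustration; they are not the actual accpac contract names.

def sdata_query(base, resource, fields):
    """Build an SData query URL with a select clause."""
    return f"{base}/{resource}?select={','.join(fields)}"

url = sdata_query(
    "http://sage300erpdemo.na.sage.com/SDataServlet/sdata/sageERP/accpac/SAMINC",
    "oeOrders",
    ["OrderNumber", "Customer", "OrderDetails/Item", "OrderDetails/Quantity"],
)
print(url)
```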

Define Your Own Contracts and Have Your Own Classmaps

In version 6.0A, the only contract we supported for SData feeds created by Sage 300 ERP was the accpac contract. Now, in the classmap file, you can specify the contract your feeds belong to. This setting has always been in the classmap files; it just didn’t work before. This means you can ensure any feeds you define yourself won’t conflict with anyone else’s.

Another problem in 6.0A was that to create your own feeds, you either needed to be an activated SDK application with your own program ID, or you needed to edit one of our existing classmap files. This was annoying, since your changes could well be wiped out by a product update. Now you can copy your own classmap file into an existing application (like O/E); you just need to name it classmap_yourownname.xml and it will be added to the defined SData feeds.
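The naming rule can be illustrated with a small discovery sketch. The discovery logic, and the assumption that the stock file is named classmap.xml, are my own invention for illustration; only the classmap_yourownname.xml pattern comes from the post.

```python
import glob
import os
import tempfile

# Sketch of the naming rule described above: any file named
# classmap_yourownname.xml dropped into an application's folder is picked
# up alongside the application's own classmap. The discovery logic and
# the stock file name "classmap.xml" are assumptions for illustration.

def discover_classmaps(app_dir):
    """Return classmap files for an application: stock file first, then add-ons."""
    stock = os.path.join(app_dir, "classmap.xml")
    extras = sorted(glob.glob(os.path.join(app_dir, "classmap_*.xml")))
    return ([stock] if os.path.exists(stock) else []) + extras

# Demo: an O/E-like folder with a stock classmap plus a custom one.
with tempfile.TemporaryDirectory() as app_dir:
    for name in ("classmap.xml", "classmap_mycustom.xml"):
        open(os.path.join(app_dir, name), "w").close()
    print([os.path.basename(p) for p in discover_classmaps(app_dir)])
    # -> ['classmap.xml', 'classmap_mycustom.xml']
```

Because the glob pattern requires the underscore, the stock classmap.xml is never matched twice, and custom files survive product updates that replace the stock file.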

Further, all the feeds that make up the accpac contract were oriented to the Orion UI work and weren’t necessarily a good fit for people doing general integration work. So we are creating a new contract, containing the main master files and documents, that is oriented towards integration work and stateless operation.

Summary

SData continues to be a foundation technology that we are building on. Quite a lot of work has gone into improving our SData support in Sage 300 ERP for the forthcoming 2012 release. This will allow us to leverage many other technologies like Argos to accelerate development.

If you are interested in learning more about SData check out my learning SData videos which are located here and which I blogged about here.

Written by smist08

May 26, 2012 at 4:31 pm

Sage 300 ERP 2012 Payment Processing

Introduction

We introduced an integration from Sage 300 ERP (Accpac) to Sage Exchange in version 6.0A (with a retrofit to 5.6A). This integration allows ERP users to take credit card transactions directly from ERP screens including pre-authorizations and charges. I blogged about this in these two articles: Accpac Credit Card Processing and Accpac Payment Processing.

Now as we approach our next release we are including a number of enhancements to this integration. We are in the process of changing our version numbering scheme, so the next release of Sage 300 ERP will be Sage 300 ERP 2012 rather than 6.1A. However it is still the next version of Sage 300 ERP after 6.0A.

With this upcoming release we are going to add three main features:

  • Ability to capture pre-authorizations in Shipment Entry, Invoice Entry, or both. Currently users can only capture pre-authorizations in Shipment Entry when items are shipped. Many customers tell us that they would prefer to have office personnel perform the capture rather than have this done during Shipment Entry.
  • Ability to capture a number of orders from different customers in a batch rather than individually. This will streamline operations, especially in high-volume companies.
  • The system will automatically “force” expired pre-authorizations without prompting the user. “Force” is the process of doing a capture (post-authorization) for a pre-authorization that has expired. Currently a prompt appears if the pre-authorization has expired and users have to select whether they want to force a payment. This change streamlines operations and removes unnecessary user interaction with the software.

Capture Pre-Authorizations During Invoice Entry

This feature basically means exposing in Invoice Entry the functionality already available from Shipment Entry. Capturing a pre-authorization really just means charging the credit card for real, so you get paid. Under credit card processing rules, the earliest you are allowed to capture (or charge) the credit card is when the item ships. However, due to separation of duties, in many companies the people doing the shipping aren’t the right people to process the credit card. Usually this needs to be done by a finance person, and it is most convenient for them to do this when they prepare the invoice (since the shipment has already been done).

Here is the Invoice Entry screen displaying the Invoice Prepayments screen with full credit card functionality. Notice the “Capture…” button on the main form.

When you hit Charge, you get the “Capture Pre-authorization” screen:

And then when you hit “Process Payment” it will capture the pre-authorization, so you will be paid.

Capture a Batch of Orders

Now, let’s look at how we will “capture” a number of orders in a batch. For any orders that have been pre-authorized, this means really charging their credit cards. To do this we have created a new form in the Order Entry Transactions folder:

When you run this form you get:

From this screen you specify the A/R batch to add the transactions to (or create a new one). Then you get a list of orders which are candidates for charging. To be in the list, an order must have an outstanding pre-authorization and have shipped; it is a rule from the credit card companies that you can only charge for items that have shipped to the customer. Select which orders you want to capture (charge). Once you have chosen all the orders, all you need to do is hit the “Process” button and away it goes.
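The eligibility rule above (outstanding pre-authorization, and shipped) amounts to a simple filter. The order records below are entirely hypothetical; the field names are not from the actual product.

```python
# Sketch of the eligibility rule stated above: an order qualifies for the
# batch capture list only if it has an outstanding pre-authorization and
# has shipped. The order records and field names are hypothetical.

def capture_candidates(orders):
    """Return orders eligible for batch capture."""
    return [o for o in orders if o["preauth_outstanding"] and o["shipped"]]

orders = [
    {"id": 1, "preauth_outstanding": True,  "shipped": True},   # eligible
    {"id": 2, "preauth_outstanding": True,  "shipped": False},  # not shipped yet
    {"id": 3, "preauth_outstanding": False, "shipped": True},   # already captured
]
print([o["id"] for o in capture_candidates(orders)])  # -> [1]
```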

This should make it easier for companies to process a high volume of Orders.

Automatically “Force” Expired Pre-Authorizations

In the current system if a pre-authorization has expired then we put up a yes/no question when you go to capture the transaction asking whether you want to “force” it. Forcing a transaction may not work for various reasons and usually causes higher transaction fees on the transaction. This is why we put up the warning dialog, so if people don’t want the extra fees then they can avoid them.

However, the feedback we have received is that this prompt is just annoying. If you are taking credit card transactions, then you are willing to put up with the fees and would like to try to collect any money you can. For instance, if the transaction fails because the customer has maxed out their credit card, then if you hadn’t tried, you wouldn’t have gotten anything anyway (or would have gotten something like a bad check, which has its own fees).

We may offer an option for this, but it seems like the consensus is that people would like the process streamlined.

Summary

The original credit card processing support that we previously added has been quite successful, and we are looking to build on it by continuing to add functionality in each release based on customer feedback. Hopefully these new features will keep our Sage Exchange integration growing, and as it grows we will get more feedback and enhance the integration further. Notice that sometimes listening to feedback means streamlining a process or removing a feature, not just adding new bells and whistles.

Written by smist08

April 21, 2012 at 4:12 pm

Five Whys

Introduction

The “Five Whys” is a method for performing a “Root Cause Analysis” to detect the true cause of problems in a product or process. The Five Whys was developed by Taiichi Ohno, the father of the Toyota Production System and is a cornerstone for how Toyota achieves the quality it does.

Software development is a complex process, and modern business software is quite large and complicated. If something goes wrong, we have a tendency to just blame the first named culprit; for instance, if a bug appears in the field, blame the QA person who tested that feature. This is usually accompanied by a lot of talk about people being “accountable” for the work they do.

But if you step back a bit you realize that actually a much larger system has failed. The QA person wasn’t the only person to look at that feature. The programmer was responsible for testing it (both manually and by providing unit tests) before giving it to QA. A software architect reviewed what the programmer was doing. The entire product went through a regression test before release. The feature was reviewed by a Business Analyst and a Usability Analyst. The product and feature had gone through a customer beta testing phase. And then, in spite of all this process and review by all these responsible and accountable people, the bug still made it out into the field. So superficially blaming one person doesn’t give a good perspective on what went wrong and doesn’t help prevent the problem from happening in the future.

The real goal of these methods is to allow companies to learn from their mistakes rather than to assign blame. Generally the process works way better when everyone is confident that this is the real intent. If everyone believes the goal is to assign blame, then the whole process goes political and getting to the truth is usually impossible.

Learning from mistakes has its own pitfalls. One common theory for why successful companies eventually fail is that they dutifully analyze problems and learn from their mistakes: for every problem that happens, they develop a process or procedure to prevent that problem from ever happening again. The downfall is that, as time passes, the number of processes and procedures created this way becomes huge. This makes the company very bureaucratic and getting things done very hard. Great care has to be taken that the outcome of these studies doesn’t just bury a company in procedures that get harder and harder to follow. The best result is if the current process can be simplified, perhaps eliminating the step that created the mistake; after all, simple processes and procedures are much easier to follow and less error-prone. When developing remedies to the problems analyzed, careful attention has to be paid to the cost of the solution, so that the solution isn’t worse than the original problem.

Performing a Root Cause Analysis study takes time and attention, so when you want to get started with this, start with the worst and most easily defined problems first. Run a few pilots to get used to the methodology. One good place to start is problems found in the field by customers; as mentioned before, these indicate a fairly serious systemic failure that is worth fixing. Problems found in the field are usually categorized by severity, so you can start with the most severe problem and work down the list.

The Five Whys

The basic concept behind the Five Whys is really simple: keep asking why, five times over, to get past the superficial reasons to the underlying one. It’s something we all did as three-year-olds, when we kept annoying our parents by asking why over and over again. As children we perceived that this was a good way to get a deeper understanding of the world, but then we abandoned the technique as we got older (or were scolded enough times). The insight here is that we had it right as three-year-olds and shouldn’t have given up.

As an example, here is how an interviewer might question a programmer:

  • Why did this defect make it into the product?
    – Because the code review and unit tests didn’t catch it.
  • Why didn’t the code review catch it?
    – Because we substituted a reviewer from another team.
  • Why did you substitute the reviewer?
    – Because our team was frantically coding and couldn’t spare the time.
  • Why were they so busy coding?
    – Because the stories they committed to were larger than expected.
  • Why was that?
    – Two key dependencies were underestimated and forced a lot of extra work.
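The interview above can be modeled as a simple chain of question/answer pairs, where the last answer is the candidate root cause. This is just a trivial sketch of the data, not any kind of tooling we use.

```python
# The Five Whys interview above as a chain of (why, because) pairs.
# The last answer is the candidate root cause. Purely illustrative.

five_whys = [
    ("Why did this defect make it into the product?",
     "The code review and unit tests didn't catch it."),
    ("Why didn't the code review catch it?",
     "We substituted a reviewer from another team."),
    ("Why did you substitute the reviewer?",
     "Our team was frantically coding and couldn't spare the time."),
    ("Why were they so busy coding?",
     "The stories they committed to were larger than expected."),
    ("Why was that?",
     "Two key dependencies were underestimated and forced extra work."),
]

root_cause = five_whys[-1][1]
print(root_cause)
```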

This is a fairly simplified and abbreviated example, but it shows the basic idea. Also notice that each question can often lead to several avenues of pursuit at the next level; it’s up to the interviewer to decide whether to pursue several or follow the main thread. The diagram below shows how this might go.

Root Cause Analysis

When you perform a Root Cause Analysis (RCA), you usually work through the following phases:

The Five Whys are usually performed in the “Data Analysis and Assessment” step. The preceding “Data Collection” step is used to identify the correct people, so you know whom to interview. Typically you want a separate Five Whys interview with each of the people involved with the problem, and then perhaps with additional people, depending on the outcome of the interviews.

Generally you do each interview separately and document all the questions and answers. We document everything to do with the process on a wiki, so everything is kept transparent and visible. After all the interviews, you then need to perform the assessment to identify the root causes and choose the items you want to pursue. This is usually done in a couple of brainstorming sessions: first everyone studies the documented Five Whys interviews, then they get together to discuss what happened and suggest how to change processes or make other changes. All these ideas are documented. Then some time is left to think about them, and another meeting happens to decide which ideas will be implemented, keeping in mind that we don’t want solutions worse than the problem and don’t want to introduce unnecessary bureaucracy. Then we apply the solutions and inform all affected departments. Follow-up is scheduled to ensure the solutions are working and we are getting the desired results; if they aren’t, they should be scrapped or changed.

Summary

Usually when people first read about RCA they take it as a very heavy-handed procedure for addressing simple problems. However, when you’ve run a few of these, you quickly realize that most problems aren’t that simple and that solving systemic problems can be quite difficult. I’ve found the Five Whys an extremely effective method for getting to the bottom of things; one of its key benefits is that it forces people to take the time to really think about what happened. The documentation produced then provides a lot of visibility into what happened, which usually means the solution meets less resistance when it is implemented.