Stephen Smith's Blog

Musings on Machine Learning…

Archive for the ‘TCO’ Category

Customization: A Two-Edged Sword


Introduction

When implementing a mid-market ERP system, it’s often suggested that the base ERP should provide 80% of the needed functionality and then the other 20% is provided by customization. The rationale behind this is that although GAAP (Generally Accepted Accounting Principles) is standard, each business is unique and has unique requirements in addition to GAAP, especially in modules outside of the core financials.

All Sage ERP products offer a high degree of customizability. Generally we use this as a competitive advantage in making sales, since we can more closely match a customer’s requirements. However, some customizations can lead to quite high costs and problems down the road (perhaps a version or two later).

As more and more customers are looking to move to the Cloud, we need to rethink customization and what we are trying to achieve. This blog post looks at some of the problems with customization and some things to consider when recommending it. These considerations apply both to hosting current ERPs in the Cloud and to how we have to think about Connected Services and Features as a Service (FaaS) in the Cloud.

Problems

This is a rather unfair list of many of the problems we typically see with customizations:

  1. Cannot upgrade to a new version of the software because it means doing the customizations all over again.
  2. An initial product implementation fails because the customization runs over budget and behind schedule.
  3. Some customizations cause system stability issues, for instance adding SQL triggers to the database. If the trigger fails, the transaction that invoked it fails with a really hard-to-diagnose error (see the sketch after this list).
  4. Some customizations cause performance problems. Perhaps a custom report or inquiry causes a SQL query that takes hours to finish, slowing everyone else down in the meantime.
  5. In a hosted version everyone runs the same programs, so some customizations can prevent you from moving to the cloud.
  6. Extra testing is required for every product update or hotfix to ensure they don’t interfere with customizations. This restricts the use of things like automatic updates.
  7. Support becomes harder. Customer support has a hard time diagnosing problems in and around customizations because they don’t know what the customizations are meant to do or what they might affect.
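To illustrate point 3, here is a hedged sketch of how a trigger added behind the ERP’s back can make the ERP’s own transactions fail. It uses Python with pyodbc against SQL Server; the DSN, table and trigger names are invented placeholders, not anything from a real Sage schema.

```python
# Sketch of the trigger failure mode: a customization's trigger fails at
# runtime and takes the ERP's own insert down with it.
import pyodbc

conn = pyodbc.connect("DSN=SAMINC;UID=sa;PWD=secret")  # placeholder connection
cur = conn.cursor()

# A customization might add a trigger like this behind the ERP's back.
# SQL Server's deferred name resolution lets the trigger be created even
# though the AUDITLOG table it references doesn't exist.
cur.execute("""
CREATE TRIGGER trg_audit_orders ON ORDERS AFTER INSERT AS
BEGIN
    INSERT INTO AUDITLOG (ORDER_ID) SELECT ORDER_ID FROM inserted;
END
""")
conn.commit()

try:
    # The ERP's own insert now fails inside the trigger, with an error
    # about a table the ERP has never heard of.
    cur.execute("INSERT INTO ORDERS (ORDER_ID) VALUES (?)", 12345)
    conn.commit()
except pyodbc.Error as err:
    print("Order insert failed inside the customization's trigger:", err)
```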

Moving to the Cloud

As we look to move more customers to the cloud, we have to deal with all these customization issues. Some level of customization is necessary, so the question to some degree becomes how much? In the cloud we want everyone sharing the same set of programs, so no customized DLLs or EXEs. In the cloud we want everyone to immediately receive hotfixes and product updates, so customizations must be upgrade safe. Further when a new major version comes out, we want everyone in the cloud moved to this new version in a quick, transparent and automated fashion. Hence any customizations allowed in the cloud can’t prevent data activations and can’t require a lot of special testing for each new version.

Causes

So what causes many of the problems above? A lot of customization can be a sign of an incorrect initial product choice, resulting in trying to fit a square peg into a round hole. It can also indicate an overly-eager-to-please sales team promising the product will do many things that it normally doesn’t. Plus, many companies’ main line of business is developing customizations, so providing these is a major source of revenue.

A big cause of problems with upgrading customizations is database schema changes in the core ERP. These are usually the result of adding new features or streamlining functionality for performance and scalability reasons.

Even when the database schema doesn’t change, business logic changes can have unforeseen effects on customizations; perhaps a customization relies on the way the business logic used to work.

Solutions

As solutions move to the cloud, the nature and scope of customizations is changing. This affects vendors like Sage, who need to make sure all customizations can be updated automatically, and it affects customization consultants, who can no longer install EXEs or DLLs into the ERP or CRM system itself. So what do we do, given that there are still vital business needs to be addressed?

First off, we as a vendor have to be much more disciplined in how we change our applications from version to version. Long gone are the days when we could just make any change we wanted and then throw it over the fence to the Business Partners to figure out how to accommodate it in the whole ecosystem. This made the cost of upgrading quite expensive and detracted from the overall customer experience, because upgrades generally introduced all sorts of problems that then needed to be stamped out. As we begin to automatically update people, both in the cloud and on-premise, we have to ensure the whole process is automatic and seamless. We have to:

  • Maintain application API compatibility so customizations and ISVs continue to work.
  • Limit schema changes so that we don’t break customizations or ISVs. Mostly just add fields or tables, and even if fields are no longer really used, leave them in for compatibility.
  • Provide tools to automatically convert anything beyond the database that is affected like Crystal Reports or VBA macros, so no additional work is required.
  • Ensure that we can upgrade independent of database state, i.e. don’t require that all batches be posted or such things.
  • Work closely with ISVs and Business Partners through alpha, beta and early adopter programs to ensure the entire ecosystem won’t be disrupted by new features and versions.
  • More tightly control the customizations that are allowed in the cloud. Even for custom reports and VBA macros, have rules and monitoring procedures so that badly behaved ones can be found and removed (or fixed).

When we are running our application in the cloud, how do we perform common integration-type customizations? A great many of our clients create programs to feed data into and out of an ERP, integrating with things like a custom subscription billing service or a company shopping web site. Since you can’t install EXEs or DLLs, and even if you could do this via a VBA macro, chances are external communications would be blocked by a firewall, what do you do? For these cases you have to switch to using Web Services, which in the Sage world means SData. With SData you will be able to create Cloud-to-On-Premise or Cloud-to-Cloud integrations. Basically you are removing the requirement that the two applications being integrated are on the same LAN.
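As an illustration, here is a minimal sketch of such an integration talking to an SData feed from outside the LAN. The base URL, dataset, resource name and credentials are placeholders; real resource names come from each product’s SData documentation.

```python
# A hedged sketch of a cloud-to-on-premise integration over SData.
# SData exposes resources as RESTful Atom/XML feeds over HTTP(S).
import requests

# Placeholder URL: the sdata/{application}/{contract}/{dataset} path
# varies by product.
BASE = "https://erp.example.com/sdata/sageERP/accpac/SAMINC"

session = requests.Session()
session.auth = ("ADMIN", "password")  # placeholder credentials

def fetch_orders(start=1, count=10):
    # SData supports paging via startIndex/count query arguments.
    resp = session.get(f"{BASE}/salesOrders",
                       params={"startIndex": start, "count": count})
    resp.raise_for_status()
    return resp.text  # an Atom feed of order entries

print(fetch_orders())
```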

Summary

Changing the thinking on customization will take time, but the industry and customer expectations are changing. Sage and its Business Partners need to be more disciplined in how we all customize to ensure we don’t create longer-term problems for customers. In the Cloud world these problems show up much more quickly than in the on-premise world. We have to re-evaluate the technologies we are using and how our common customization design patterns are changing.

I tend to think that once we get through some transitional pain, we will come out the other side with a more powerful and more sustainable customization model that will lead to a better customer experience.


Written by smist08

September 29, 2012 at 6:20 pm

Sage Visual Process Flows


Introduction

Sage ERP X3 has had a feature called “Visual Process Flows” for some time now. These process flows help users navigate the product to perform their daily tasks. They are fully customizable and you can create these for all the roles in your organization. Below is a picture of a process flow in the Sage ERP X3 desktop.

However, this visual process flow tool wasn’t just built into the Sage X3 desktop. It was actually developed as a standalone component that could be integrated into any product. Now we are in the process of integrating this into a number of other Sage ERP products starting with Sage 300 ERP 2012.

Notice that there is now a “Sage Visual Processes” top-level menu item on the Sage 300 Desktop, which contains the list of process flows that have been assigned to a user. To assign flows to users there is an Administrative Services program for this purpose. The example shown isn’t complete yet, but when Sage 300 ERP 2012 ships it will include a number of stock process flows. We can then augment this collection as we go along using Sage Advisor Update. If you have the proper security rights, you can right-click on an item in the tree and select Edit from the context menu to edit the process flow. The Process Flows we ship go in the program files area as templates, and any customized ones go in a separate area in shared data. This way we can freely update the templates without overwriting customized flows.

Process Flows

The primary goal of these Process Flows is to allow users to quickly find their tasks and launch screens without searching through the ERP’s giant tree of icons. Within the Process Flows, it is easy to include much more descriptive text of what needs to get done. Most small business products like Sage 50 ERP present the user with this model of navigation. However, for mid-market ERP, having a number of static, hard-coded process flows is inadequate. With this tool you can customize Process Flows for all the roles a mid-market customer requires. You can add your own graphics and be quite creative in the construction and look of these flows.

You could argue that Process Flows don’t give you anything new, since you are still running the same old screens and, after all, you can customize the tree of icons to limit what a user sees. However, from all our usability studies, we find that even with this, users still spend quite a lot of time searching for the correct screen to run, especially ones they run very infrequently. Plus, I think the Process Flows are quite attractive and more appealing than just displaying a list of icons.

You can run one process flow from another, so using these you can construct wizards to step people through longer processes. A great use is to create Process Flows for our old setup checklists. Another thing you can do is have a master process flow that drills down into more detailed process flows, creating a more graphical representation of our tree of icons.

If you are familiar with Sage ERP X3 and its Process Flows, then you should feel right at home with the tool in Sage 300 ERP. You can even export a process flow from one product and import it into the other; then you just need to change the Sage ERP X3 screen codes to the Sage 300 ERP screen codes (rotoids like OE1100). The actual Process Flows are stored in JSON, a text format that is easy to work with. Along the same lines, if you have taken the Sage University course on working with Sage ERP X3 Process Flows, then you already know how to work with Sage 300 ERP process flows.
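Because the flows are plain JSON, the screen-code substitution can even be scripted. Here is a hypothetical sketch; the JSON keys and the X3-to-300 code mapping are invented for illustration, since the real process flow schema isn’t shown here.

```python
# Sketch of porting a process flow between products by remapping screen codes.
import json

# Hypothetical mapping from Sage ERP X3 screen codes to Sage 300 rotoids.
SCREEN_MAP = {"GESSOH": "OE1100", "GESBPC": "AR1300"}

def remap_flow(path_in, path_out):
    with open(path_in) as f:
        flow = json.load(f)
    for step in flow.get("steps", []):    # "steps" is an assumed key
        code = step.get("screenCode")     # "screenCode" is an assumed key
        if code in SCREEN_MAP:
            step["screenCode"] = SCREEN_MAP[code]
    with open(path_out, "w") as f:
        json.dump(flow, f, indent=2)

remap_flow("x3_order_flow.json", "sage300_order_flow.json")
```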

Sage 100/500 ERP

These are being fitted into the Sage 300 ERP Desktop first, and later they will be fitted into the Sage 100 ERP and Sage 500 ERP Desktops. Later we will also consider running these from our Sage 300 ERP Web Portal. For that matter, we can look at other places to run the Process Flow tool, perhaps as a starting page for tablet applications.

Sharing and Collaboration

I know partners like to charge for what they do, but it would be nice if there was a central place where Process Flows could be freely shared. I wonder, if Sage started such a site, would many partners contribute to it? Would customers contribute to it? If there was a lot of contribution it could eventually provide a much larger library of Process Flows than anyone could develop individually. The stone soup approach has worked quite well in other areas, though I know when we tried this with Crystal Reports it failed quite badly. Still, crowdsourcing and open-source techniques are a very effective way to build critical mass. Sage ERP X3 has over 150 process flows; the other Sage ERPs are starting from scratch but will leverage the X3 work to catch up. Hopefully with community participation we can exceed this by quite a bit.

Summary

Adding Visual Process Flows to all our products not only makes them easier to learn and use, but also complements the various branding initiatives to make the various Sage ERPs into a family of related products. Plus, this feature gives all our desktops a nice visual refresh to make them much more appealing and modern.

Sage Advisor PEP


Introduction

Sage Advisor is an umbrella term for a number of technologies and programs that are being rolled into all Sage products over their coming releases. Previously I blogged on the Sage Advisor Update project here. In this blog post I’m going to talk about the Sage Advisor PEP (Product Enhancement Program).  The intent of this program is to actively gather program usage information to help Product Managers and Application Designers better focus their work and to do a better job designing and specifying new features for future versions.

This sort of information gathering is becoming very common in the software industry. Microsoft has a very extensive program that they call their Customer Experience Improvement Program.  Mozilla Firefox has a telemetry program to gather performance data. Cisco has their Smart Call Home functionality. All SaaS applications do this big time. Every SaaS application logs every call to the web server and then can archive and mine this data endlessly. For SaaS applications you don’t have a choice, since you need to talk to the Web server to talk to the application.

It’s important to remember that participation in this Sage program is purely voluntary and easy to opt out of. Further no actual data from your database is ever transmitted. We are also subject to various governmental privacy laws such as HIPAA.

This feature has been around for a while now in one form or another. We introduced the “Call Home” feature in Sage 300 ERP 5.6A. This feature sent back information on which modules a customer had activated. It was a one-time message that was sent a few months after a new version was installed and activated. With version 6.0A we introduced PEP level 1, which sent similar information to Call Home but was sending it to the central Sage collection server rather than a special one only for Sage 300. With the forthcoming Sage 300 ERP 2012 release, we’ll be implementing level 2 which sends more usage data as explained below.

To do detailed user testing, a usability lab gives the best results, but this is quite expensive and time consuming for customers. The hope here is to virtualize some of this process and get a lot of useful data without all the manual work.

Goals

The goal of this project is to provide better information to our Product Managers, Usability Analysts, Business Analysts and other developers on how real users use our products. We need to know where users are spending their time, where they are productive and where they aren’t productive. We need to know which parts of the program are working well and which parts are causing problems.

Basically we want to guide our design and efforts based on data and not opinion. This is one of the methods we are using to gather real customer usage data.

Data we are Gathering

One of the things we want to determine is where users spend their time, so we are gathering data on what screens the user starts and how long they are in that screen. From this we can get hard data on which are the really heavily used screens and then spend more effort on improving these screens. Generally we know some screens users spend a lot of time in, like Order Entry, but we are looking for surprises here. Further we can see what combinations of screens people run, so if they always run A/R Customers at the same time as O/E Orders, then we can infer there is information in this screen required by everyone doing Order Entry and that to improve the workflow we should make this information more readily available in Order Entry. Generally this is a matter of simplifying workflows and making our customers more productive.

We want to simplify the parts of the program where users are having difficulty. To do this we are recording usage of the Help: basically recording all links into the help, so we can determine which parts of the program people find difficult enough that they have to consult the help. Then we can work on the associated forms to make them more intuitive, so the user doesn’t need the help anymore.

Along the same lines, we are recording all error messages displayed. This is to see if we can change the workflow so the user doesn’t get errors. Also, if we can proactively avoid error situations, we hope to avoid a lot of support calls. For instance, if after installing, many people get a certain error that indicates things aren’t set up correctly, can we modify our installation program so people won’t run into this?
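Putting the last few paragraphs together, the client-side instrumentation amounts to timestamped events for screens, help links and errors. Below is a minimal sketch of that idea; the event shapes and names are hypothetical illustrations, not the actual PEP wire format.

```python
# Sketch of PEP-style instrumentation: screen timings, help usage, errors.
import json
import time

class UsageLog:
    def __init__(self):
        self.events = []
        self.open_screens = {}

    def screen_opened(self, screen_id):
        self.open_screens[screen_id] = time.time()

    def screen_closed(self, screen_id):
        started = self.open_screens.pop(screen_id, None)
        if started is not None:
            self.events.append({"type": "screen", "id": screen_id,
                                "seconds": round(time.time() - started, 1)})

    def help_viewed(self, topic):
        self.events.append({"type": "help", "topic": topic})

    def error_shown(self, code):
        self.events.append({"type": "error", "code": code})

log = UsageLog()
log.screen_opened("OE1100")   # user opens Order Entry
log.help_viewed("oe-taxes")   # consults the help: a candidate for UI work
log.screen_closed("OE1100")
print(json.dumps(log.events, indent=2))
```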

Big Data

Many Sage corporate presentations start with a slide proclaiming we have 6.3 million customers. This is great, but with Sage Advisor PEP it means we could have 6.3 million customers sending usage data to a corporate web server, and all this data needs to be recorded and analyzed.

This starts to put us into the world of “Big Data”. I blogged about Big Data and ERP here. Currently we are gathering all the data into SQL Server, but this is already strained with only a few Sage products contributing. We are already moving the data from the SQL Server to a NoSQL database to perform data analysis. As the volume of data continues to grow we will probably need to replace the SQL Server with something more scalable and this is a classic use case for a NoSQL database. To me this is an exciting initiative to use and become familiar with Big Data technology. As Sage moves forward this will become a more and more important technology to gain expertise in.

We do take care not to delay people using their business application in order to send usage data. We always start a new thread or program to transfer the data, so we don’t block the main program for the user as it uploads data. Also, we don’t consider this data crucial, so we don’t need to worry too much if some is lost because the system is too busy.
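Here is a sketch of that fire-and-forget pattern: a background daemon thread posts the usage data, the main program never waits on it, and failures are silently dropped since the data isn’t crucial. The collection endpoint is a placeholder.

```python
# Non-blocking, best-effort upload of usage data on a daemon thread.
import json
import threading
import requests

def _upload(events):
    try:
        requests.post("https://collect.example.com/pep",  # placeholder endpoint
                      data=json.dumps(events),
                      headers={"Content-Type": "application/json"},
                      timeout=5)
    except requests.RequestException:
        pass  # non-crucial data: lose it rather than disturb the user

def send_usage_data(events):
    # Daemon thread: won't block the UI and won't keep the app alive on exit.
    threading.Thread(target=_upload, args=(events,), daemon=True).start()

send_usage_data([{"type": "screen", "id": "OE1100", "seconds": 42.0}])
```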

Summary

Gathering usage data is becoming more and more common in the software industry. Sage is stepping up our efforts to gather good usage data from all our products. The primary goal is to feed this back into the organization to improve our products and processes, and to become more scientific in the ways that we improve our products.

Written by smist08

March 25, 2012 at 12:53 am

Sage Advisor Update


Introduction

There is a lot of debate on the relative benefits of installing software on-premise versus using a cloud or SaaS based product. Generally, with SaaS based software you don’t need to manage backups or infrastructure (that is all done for you) and you don’t need to perform software updates; they just continually happen in the background. On-premise software gives you more control of your environment, allows greater customization, and means you don’t need to worry about your data being in someone else’s hands. The team at Software Advice wrote a good article on the benefits of on-premise software here.

One of the advantages of SaaS is not having to install updates; they just appear magically, ready for your use. On-premise software can do this as well via auto-updates. For instance, the Google Chrome browser will silently update itself whenever it notices a new version is available, and usually you never notice that it has updated itself. Other products let you set options on how they update, or prompt you whenever an update is available, letting you choose at that point what you want to do.

Sage is looking to add automatic software updates (similar to Windows Update), called “Sage Advisor Update”, to all our products. A couple of Sage products like the Nonprofit Solutions already include this. Most other products will be rolling this out with their next release over the next year. When a Sage product with this capability installs, it will add a new Sage Advisor Update program to your Start Menu. Think of this like the Apple Software Update program. Below is a screen shot with some labels of this program:

This will give you a list of all the updates available for all your Sage products installed on this computer. From this program you can see what is available, then download the update and optionally install it. Sage Advisor Update is aware of client/server type installations as well as security requirements for installation (like being an Administrator).

Starting Slowly

Sage has a large set of business applications, based on all sorts of technologies and using all sorts of customization schemes. Running auto-update on an ERP which has source code customizations will remove all those customizations. Installing a product update that includes Crystal Reports runs the risk of overwriting customized reports. Perhaps all the workstations need an update to match a server update, and this needs to all be done at once rather than at the whim of a single workstation user. Plus, most large business applications are integrated with all sorts of third party products, and compatibility with all of these needs to be checked.

These are all the sorts of pain points that cause upgrades to be historically expensive. Ultimately we would like to be able to silently and frictionlessly install updates without any user intervention, but we will start slowly. For the initial releases, Sage Advisor Update will notify you of updates and download them for you. However, it won’t take the next step of automatically installing them. This allows you to contact your business partner to confirm whether whatever needs installing is safe for you, given the customizations, third party products or integrations you might be running. Additionally, Sage Advisor Update is fully configurable, so you can tell it what you want done or not done.
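In pseudocode terms, the initial behavior is check, compare, download, stop. Here is a hedged sketch of that loop; the manifest URL and its JSON format are invented for illustration and are not the real Sage Advisor Update service.

```python
# Sketch of notify-and-download-but-don't-install update checking.
import json
import urllib.request

MANIFEST_URL = "https://updates.example.com/manifest.json"  # placeholder
INSTALLED = {"Sage300ERP": "6.0.1"}  # hypothetical local registry of versions

def version_tuple(v):
    # Compare versions numerically so "6.0.10" sorts above "6.0.9".
    return tuple(int(part) for part in v.split("."))

def check_for_updates():
    with urllib.request.urlopen(MANIFEST_URL) as resp:
        # Assumed format: [{"product": ..., "version": ..., "url": ...}, ...]
        manifest = json.load(resp)
    for entry in manifest:
        current = INSTALLED.get(entry["product"])
        if current and version_tuple(entry["version"]) > version_tuple(current):
            print(f"Update available for {entry['product']}: {entry['version']}")
            urllib.request.urlretrieve(entry["url"], f"{entry['product']}-update.exe")
            # Deliberately stop here: installing is left to the customer and
            # their business partner, who know what customizations are in play.

check_for_updates()
```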

Perhaps it will be a while before we can perform a major version upgrade silently behind the scenes. However, perhaps we can automatically install some critical hotfixes that we know won’t break customizations, and perhaps we can quickly reach the point of quietly installing product updates behind the scenes. Part of this is discipline on Sage’s part: being very careful that product updates don’t do anything that will break customizations.

Customizations

All Sage products tend to have both upgrade-safe and upgrade-unsafe customizations. Upgrade-safe customizations will not be broken by Product Updates and new versions of the software; the others will (or might) be broken by a new version, or will break it. For instance, in Sage 300, if you customize a Crystal Report in the directory where it’s installed, then it is not upgrade-safe, since it will be overwritten by a new version. However, if you place that report in a customization directory, then it is upgrade-safe, since it won’t be overwritten. Similarly for Sage 100: if you use the customizer tool, you are upgrade-safe; if you modify the source code, you aren’t.
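The customization-directory mechanism boils down to a search order at report-load time: look in the customization directory first, then fall back to the stock report. Here is a sketch of that idea; the directory layout is an assumption for illustration, not Sage 300’s actual search logic.

```python
# Sketch of upgrade-safe report lookup: customized copy wins if present.
from pathlib import Path

PROGRAM_REPORTS = Path(r"C:\Sage300\OE62A\ENG")           # overwritten on upgrade
CUSTOM_REPORTS = Path(r"C:\Sage300\Customization\USER1")  # upgrade-safe area

def resolve_report(name):
    custom = CUSTOM_REPORTS / name
    if custom.exists():
        return custom              # customized copy survives product updates
    return PROGRAM_REPORTS / name  # stock report shipped with the product

print(resolve_report("OEORD01.RPT"))
```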

As we move forward we want to make all customizations upgrade-safe. This way new versions can be installed at a far lower cost than they are today. Ultimately, customizations won’t be an impediment to upgrading or to the Sage Advisor Update program.

Frictionless Upgrades

Ultimately we would like to make the whole upgrade process frictionless and remove the current large cost of upgrading customers’ business applications. Sage Advisor Update provides a tool that can notify you when upgrades are available, download them, and possibly install them. The next step is ensuring that our product updates don’t cause problems with customizations, either by investing more in upgrade-safe customization technology or by having the product update itself migrate customizations to work with the new update.

Another aspect of this is to make sure any new features are easily and naturally accessible as well as very easy to learn. To make upgrades truly frictionless, they shouldn’t require a large training effort to get everyone to use them effectively. Other aspects of Sage Advisor technology have the ability to watch how you are using the software and offer training tips to take better advantage of what you have. A very large percentage of the new feature requests we receive are for features that are already in the product, so we need to do a better job of helping people make use of these.

Downsides to Avoid

That’s not to say auto-update can’t be annoying. When I returned from African Insights 2012, I booted up my home computer and it proceeded to install Windows Updates, Apple Updates, Flash Updates, Acrobat Updates, etc., etc. Basically it took over an hour to boot my computer and have it usable. Perhaps I shouldn’t have clicked the buttons to let all these updates proceed, but I like to stay up to date. Especially annoying are updates that require a computer re-boot to complete.

In Africa, several people I met never run Windows Update. This is because they access the Internet over the cell phone network and have very low download limits. If they let Windows Update run then it uses up their download limit and Internet access becomes quite expensive.

Of course a worse case is having some favorite program break, so you can’t use it until the next update, or you have to manually uninstall it and then manually install an older version. This has happened to me with Google Chrome a couple of times. Worse still is when Windows Update messes up and you are stuck with re-installing Windows.

We need to keep all these things in mind when we deploy updates: that we aren’t just putting a lot of noise in front of customers; that we are respectful of people’s time and the fact that they need to get work done with their computers; that they may have restrictions like bandwidth limits and will need ways to work around these. We especially have to increase our product quality so that our automatic updates don’t break things and don’t cause problems. A certain number of current upgrade difficulties are due to Sage pushing upgrade tasks to Business Partners rather than automating the process; now Sage needs to spend the time and make the investments to automate all these tasks.

Summary

I just updated my iPad and iPhone to iOS 5.1 and this was a fairly painless procedure. It took each device a little time to download and install the update, but they did it all without intervention from me. It will be nice when upgrading large multi-user business applications is just as easy.

Written by smist08

March 10, 2012 at 6:22 pm

ERP and Crystal Reports


Introduction

I put a questionnaire on how people like Crystal Reports up on the “Sage Partners, Employees & Alumni Networking Group” on LinkedIn and received a lot of good discussion. I’ve previously blogged on customizing Crystal Reports for Sage 300 ERP (Accpac) here. This post is more about the history of ERP-to-Crystal integrations and the various challenges faced.

History

Accpac (now Sage 300 ERP) was originally developed by the Basic Software Group in Vancouver in the eighties. They started a project to build a new WYSIWYG report writer as part of the product, rather than the codes-based approach being used. This project was cancelled, and some of the people involved quit and started their own company to develop Crystal Reports (then Quick Reports). It was originally developed as an add-in product for the MS-DOS based Accpac Plus. Then they generalized the product to dynamically look at any database, and that became the Crystal Reports of today. The original founders struck it rich when they sold their company to Seagate in 1994, which was trying to branch out from hard disks into software. Eventually Seagate gave up on this, and Crystal went private and independent as Crystal Decisions. Shortly after, in 2003, it was sold again to Business Objects. Then in 2008 Business Objects was bought by SAP, where it lives today.

All of Sage 100, 300, 500 and X3 ERP use Crystal Reports as their main reporting engine. Sage HRMS and Sage CRM also use Crystal. We all adopted Crystal long ago at some stage or another. Sage 300 ERP adopted Crystal when we switched from using CARET (the Computer Associates report writer) back at version 3.0A.

Most products started integrating with Crystal in the 16-bit world of Windows 3.1. We all integrated via the interface to the DLL crpe.dll. This interface let us print reports to a printer, preview or file. It let us set parameters that were passed from the application’s entry forms to Crystal, for things like to/from ranges. The Crystal runtime that included crpe.dll was installed to the directory c:\windows\crystal. This meant that any product that installed the Crystal runtime would usually overwrite what was already there, so the last product using Crystal to be installed would usually work, whereas most other products using Crystal would then stop working. This was the famous DLL hell.

In those days we had to use native drivers for accessing the database; Sage 300 ERP shipped two versions of its reports, one set using the native Btrieve driver and the other using ODBC for SQL Server. This tended to be a maintenance headache: multiple reports for different databases, or even for different page sizes like letter vs. A4 vs. legal, or different reports depending on whether G/L is used or not. Over the years, as Crystal functionality and ODBC drivers have improved, we’ve been able to reduce the number of report versions. We now ship one version of the reports, using ODBC, that is simply configured to look at the correct database regardless of type.

Then we all started to move to 32-bit Windows with the advent of Windows 95. Crystal produced a 32-bit version, and the new interface DLL was crpe32.dll. This was a close match to the 16-bit version, and programs could work with Crystal in the 32-bit world very similarly to how they did in the 16-bit world. However, DLL hell remained. Plus, we started to see the appearance of Terminal Server and Citrix. To keep users separated on Terminal Server, anything that a user installs to the Windows directory actually goes to a local Windows directory. The problem was that the Crystal directory would go to the installing user’s private Windows directory, and hence Crystal would only work for the user that installed the product and not for any other users on the Terminal Server. This led to a manual procedure of copying the Crystal subdirectory either to the main Windows directory or to each user’s local Windows directory.

With Crystal Reports 9, Crystal moved to eliminate the DLL hell problem. They dropped support for calling crpe32.dll directly and moved the Crystal runtime files under Program Files into a version-labeled directory. This allowed multiple versions of the Crystal runtime to be installed at the same time and fixed the problems with Terminal Server and Citrix. However, applications that used Crystal had to switch from using crpe32.dll to using Crystal’s COM interface.

Crystal X and XI then followed, and these worked very similarly to Crystal 9. Another nice thing was that at this point Crystal had adopted an extensible file format, so the file format stopped changing between versions, making upgrades easier.

Crystal Reports 2008 then dropped the COM API and only supports integration via .Net and Java interfaces. This presented a number of problems, mainly because these interfaces were quite different from the ones that went before them. Plus, Crystal Reports 2008 was a fairly major upgrade from XI and seemed to introduce quite a few bugs. At this point, ERP packages integrating with Crystal either had trouble with the new interfaces or had trouble with existing reports not running properly. This led to low adoption of CR 2008. However, the file format remained the same, so you could use CR 2008 to edit reports that would later be rendered in the ERP package via an earlier Crystal runtime; and again, co-existence of multiple versions isn’t a problem due to the side-by-side DLLs.

Crystal has now released Crystal Reports 2011, and this fixes most of the problems in CR 2008, so we should start to see ERP packages moving forward again. This version still only supports .Net and Java interfaces, but has fixed a number of bugs and deficiencies that were causing adoption problems.

Challenges

For setting parameters and running reports, most ERP packages can do this fairly handily. However when you start to push the integration, what are some of the problems you can run into?

Sage ERP 500 (previously MAS 500) tries to simplify report customization by actually generating reports from a higher-level report designer built into the product. This eliminates a lot of the challenges in customizing reports inside the Crystal Designer. However, it means that Sage ERP 500 uses the Crystal Design API, and this API changes a lot from version to version, making upgrades much harder. Sage ERP 500 also controls access to its database via a custom ODBC driver that adds the Sage ERP 500 security model to generic ODBC access.

Sage 300 ERP (previously Accpac) has a concept called “datapipes”. These were originally created for CARET as a performance enhancement and then carried over to Crystal. The concept is that you use these in place of ODBC to connect to the database; the datapipes then use the Sage 300 ERP database API to access the database. This allows the datapipes to use special knowledge of the Sage 300 ERP database schema to provide faster access. We use them to “flatten” out subreports, since accessing subreports in Crystal can be slow. We also use them for controlling access to things like G/L accounts with our G/L Security module, so users don’t see data they aren’t entitled to.

Reports Server

Using the Crystal Reports runtime from an ERP package is fairly straightforward and easy. It can be installed with the ERP package and usually works quite well, so the user doesn’t have any extra hardware, installation or management problems. If you need to customize reports you need to buy and install Crystal Reports, but to just run reports there is no extra overhead. However, much new development by Crystal requires the Crystal Reports Server.

The Crystal Reports Server is a central server that manages the printing and distribution of reports. You can use it to schedule when reports run and who gets the results. Most new Crystal development now requires a Reports Server to operate. For instance, if you want to use the new Crystal Data Universe to make creating reports easier, then you need a Reports Server to hold these. Similarly, several new technologies, like Crystal’s dashboarding technology, require the Reports Server.

As ERP vendors, we now have to evaluate the ROI for our customers. Is requiring this new server worth it? Is it worth the hardware, setup and licensing costs? Long term, is this tying the ERP package too closely to Crystal, making it harder to adopt other technologies? At this point I don’t have a clear answer to these questions, but we are working with the new things in Reports Server to see their costs and benefits.

Summary

Crystal Reports has a long and interesting history and has become the de facto standard report writer across ERP packages. It is very full-featured and works with any database server. It takes a while to learn to use effectively, but it is a very powerful tool for the Business Partners that use it. The main complaint about Crystal is that, due to some of its complexities, it’s hard for end users to customize their own reports.

Written by smist08

January 21, 2012 at 4:32 pm

SaaSifying Accpac


I talked in general terms about types of cloud solutions in my blog post “Accpac in the Cloud”. In this blog post I’m going to talk more specifically about how Accpac currently runs in our AccpacOnline.com environment and the changes we are making to support running the 6.x Web version as a true SaaS solution.

AccpacOnline.com

AccpacOnline.com is our current cloud offering for both Sage ERP Accpac and SageCRM. This site hosts Sage products as well as a number of associated ISV solutions. For the current Accpac we use Citrix to run separate sessions for each client logging in. Clients are load balanced across our servers, but any combination of clients can be running on any given server. There is then a separate cluster of SQL Servers to contain all the various customer databases. Below is a diagram of the AccpacOnline.com infrastructure.

When you access Citrix to run Accpac, you first log in to Windows, then you run the regular Accpac desktop and get its sign-on screen. In the sign-on screen you only get the list of your own companies (and no one else’s), plus you can only sign on with user IDs that have been issued to you. So how does AccpacOnline.com keep all the various users of Accpac straight, given that several completely unrelated customers’ users can be running Accpac on the same server?

When you install Accpac you have to specify two directories: the program files directory and the shared data directory. The program files directory is where all the program executables are stored: all the DLLs, OCXs and EXEs. These programs are the same for all customers of AccpacOnline and are shared. However, the shared data directory is kept separate, so that all the users for a particular customer get their own shared data directory and hence their own preferences, users and companies. But how do we do this, when technically inclined users of Accpac will know that the location of the shared data directory is stored in the registry under HKEY_LOCAL_MACHINE, which is global to all users? Basically we rely on Windows. We install the program files to the regular location on C:, but we install the shared data to a G: drive. This is a mapped drive that points to a different folder for each AccpacOnline customer; when a user for that customer signs into Windows through Citrix, a login script maps the G: drive to the appropriate folder for that customer.
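The login script itself amounts to a one-line drive mapping per tenant. Here is a sketch of the idea in Python for illustration (a real Citrix login script would more likely be batch or PowerShell); the server and share names are placeholders.

```python
# Sketch of mapping each tenant's shared data folder to the G: drive.
import subprocess

def map_tenant_drive(tenant_id):
    share = rf"\\fileserver\tenants\{tenant_id}"  # per-customer shared data folder
    # 'net use' is the standard Windows command to map a drive letter.
    subprocess.run(["net", "use", "G:", share, "/persistent:no"], check=True)

map_tenant_drive("CUST0042")
```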

This gets the user up and running with the correct environment. It isn’t a security measure though; all the various folders are additionally protected by Windows security settings, so users of a given customer don’t have any sort of access to data belonging to another customer. Even if they could figure out how to map drives to other people’s folders, it still wouldn’t work for their login ID.

An alternative approach used by many vendors is to run a separate virtual server for each customer. Basically this model is managed hosting, where rather than having a physical server on-premise, you have a virtual server in the cloud. However, going this way is more expensive for a number of reasons. With the AccpacOnline model, all the users share the same installation, so updating the program files is easy for us; we don’t need to log in to hundreds of virtual servers to apply hotfixes or product updates. Plus, we are only running one copy of the operating system, so the operating system memory and resources are shared. With the managed hosting model, each customer has their own copy of the operating system using memory and resources. This means that with the AccpacOnline model we can run many more users per physical server, bringing down the cost of the service. The real game in cloud services is architecting to bring down the cost per user so you can run a competitive service.

Moving to Accpac 6

With Sage ERP Accpac 6 we are rolling out Accpac as a true Web based application. There are many reasons to do this that have considerable customer value. One of the primary reasons to move Accpac to the web is to improve the AccpacOnline.com experience and to reduce the costs to operate AccpacOnline.com to reduce the overall TCO. Basically we are looking to run a modern Internet SaaS business running Accpac. You will access it through the browser like most other Web based applications.

The key to AccpacOnline.com today is that you first log in to Windows; this establishes which customer you are a user for and sets up the environment for you. The point is that this is a two-step sign-on process: first you sign on to Windows to establish which customer (or tenant) you are, and then you sign on as a given Accpac user for that customer. In the web world we will do the same thing. Sign-on will be a two-step process: first you sign on using the same credentials you would use to sign on to Windows, which establishes which customer (tenant) you are; then you are presented with the Accpac sign-on screen, where you select which company you are currently using (and you can only choose from your own companies) and your session date. What is happening behind the scenes will be transparent to users: they will just see nice web screens where first they type in their user and password, then choose their company and session date.

In this environment Accpac is running under a Web Server and all customers are sharing the same pool of Web Servers. So we don’t have fancy Windows drive mappings to keep things separate for us. Back when we developed Sage ERP Accpac 5.6A, we knew this was coming, so we changed the Accpac API so that all calls would reflect the given customer (tenant) and retrieve the correct data for them. So most ISVs that develop third party applications should have adapted to this environment as part of supporting the 5.6A View Template. In the on-premise version of Accpac any older APIs are still present for compatibility. However in the AccpacOnline.com version, these APIs will be removed (usually low level functions missing an hSIB or hPIB parameter).

The new Sage ERP Portal that was introduced in 6.0A is implemented as a common component and as a result keeps its data in separate files and databases from the main Accpac application. As a result it underwent a bit of refactoring to separate out this data by customer and to tie into the new multi-step sign on process.

When running our VB UIs under Citrix, multiple customers share the same operating system, running as Citrix users on the Windows Server, but each user runs their own copy of the various Accpac EXEs and DLLs. Basically, Citrix and Windows Terminal Services keep each user quite isolated, and each gets their own running programs. This becomes quite expensive on memory, since each process has its own memory space. Although this is far better than virtualizing each customer, we can do quite a bit better. With the new AccpacOnline we will run far fewer processes. We will pool resources, and requests will be processed by worker threads for the duration of each request. This means each user uses far less memory and other server resources, allowing more users per server and again reducing the TCO of the AccpacOnline service.

Of course we will have multiple application servers available to handle requests and then load balance these requests across the servers (as we do today with Citrix).

We won’t be combining client databases into one database like Salesforce.com does. Part of what they do is a result of using Oracle as their database, where each database requires its own database server process. We will be using SQL Server, where multiple databases can be kept separately within one SQL Server process. Further, this allows us to use Accpac’s inherent multi-version support to let clients upgrade to new versions when they wish. Most ERP customers want control over version upgrades. NetSuite accomplishes this with separate sets of servers per version, but we can do a bit better than this.
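In other words, tenancy is handled at the database level: each customer gets its own database (and recorded version) inside a shared SQL Server, and the application resolves the tenant to a connection string at sign-on. Below is a sketch of that lookup; the catalog, names and connection details are hypothetical.

```python
# Sketch of a database-per-tenant catalog inside one SQL Server process.
TENANT_CATALOG = {
    "CUST0042": {"database": "SAMINC_0042", "version": "6.0A"},
    "CUST0099": {"database": "SAMINC_0099", "version": "5.6A"},  # upgrades when ready
}

def connection_string(tenant_id):
    entry = TENANT_CATALOG[tenant_id]
    return ("Driver={ODBC Driver 17 for SQL Server};"
            "Server=sqlcluster01;"  # placeholder server name
            f"Database={entry['database']};Trusted_Connection=yes;")

print(connection_string("CUST0042"))
```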

Even though clients can upgrade when they wish, we will be trying to make it as easy as possible to upgrade. We call this frictionless upgrade. We want to reach a state where we can regularly roll out new product updates or even versions where clients are upgraded silently in the night and it just works the next day. We still have a ways to go down this road, but that is the end goal.

Summary

We are really excited about clients running Accpac both as an on-premise installed Web application and as a true hosted Web application. I previously blogged about SaaSifying Sage and a lot of that ties in to how we deliver and operate AccpacOnline.com.

Written by smist08

June 18, 2011 at 5:11 pm

Accpac on the Amazon Cloud


Introduction

The Amazon Elastic Compute Cloud (EC2) (http://en.wikipedia.org/wiki/Amazon_Elastic_Compute_Cloud) is a service offered by Amazon.com that allows people to rent virtual computers to run applications on. Some of the innovations offered by this solution include:

  • Very easy to get started, you just need an Amazon account, attach it to EC2, and off you go.
  • Very inexpensive, with a good (nearly) free trial (http://aws.amazon.com/ec2/pricing/).
  • Scalable and expandable depending on your needs.

Often the simplicity of getting started with this solution gets lost, since people are usually confronted with the advanced features at the beginning, which you don’t need to worry about until later. Just be reassured that this is a solution that can grow with you. Below is a diagram of some of the services offered:

In this blog posting we will look at how to deploy Accpac on the Amazon EC2 cloud and discuss some of the trade-offs and choices that can be made along the way.

Terminology

One thing that makes using Amazon EC2 intimidating is the terminology. But here is a quick guide to the key points.

  • Amazon Machine Image (AMI) – These are virtual machine snapshots that you take as a starting point for doing work. Amazon provides a number of these as starting points; there are a number of public ones offered by other people, plus you can create your own. Basically, when you want a new virtual machine you take one of these as your starting point.
  • Instances – You create an instance from an AMI and the instance is the virtual machine that you actually run. When you specify the instance you specify the resources it has including memory, disk space and computing power. For more on the instance types see: http://aws.amazon.com/ec2/instance-types/.

You manage all these things from the Amazon Management Console:

Deploying Accpac

Deploying Accpac to Amazon EC2 is fairly straightforward. You just need to select a starting virtual image (AMI) of something that Accpac supports, create an instance of it, run the instance, install and configure Accpac into that image, and off you go. There are a couple of “gotchas” to watch out for that we will highlight along the way.

  1. Go to http://aws.amazon.com/ec2/ and sign up for an account.
  2. Run the AWS Management Console (https://console.aws.amazon.com/ec2) and create a PKI security key pair. You will need to do this before doing anything else. This will be the security token you use to connect to your virtual image running on EC2.
  3. On the upper left of the management console, make sure it is set to the region that is closest to you like perhaps “US West”.
  4. Click the “Launch Instance” button on the AWS Management Console. You will now be prompted to choose a starting AMI. A good one to choose is: “Getting Started on Microsoft Windows Server 2008” from the Quick Start tab. This one has IIS and SQL Server Express Installed.
  5. Select “Small” for the instance type, unless you know you will need more resources quickly. Then accept the defaults for the advanced instance options. Same for the tags screen (i.e. you probably don’t need any).
  6. On the “create key pair” screen, select the key you created in step 2 (or if you skipped that then you need to create a pair now).
  7. On the configure firewall screen, remove the opening for SQL Server, you don’t need this. The only two holes in the firewall should be RDP and HTTP. If you are hosting client data, then you should add HTTPS and setup Accpac to use that (see https://smist08.wordpress.com/2010/11/20/setting-up-sage-erp-accpac-6-0a-securely/).
  8. Now you can review your settings and Launch your instance. It can take 15 minutes or so for the instance to launch, mostly due to the time it takes Windows Server 2008 to boot. So this is a good time to go get a coffee.
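For those who prefer scripting to clicking through the console, the steps above can be automated. Here is a hedged sketch using Amazon’s boto3 Python library (which postdates this post); the AMI ID is a placeholder, so look up a current Windows Server image for your region.

```python
# Scripted equivalent of the console walkthrough: key pair, firewall, launch.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

key = ec2.create_key_pair(KeyName="accpac-key")
with open("accpac-key.pem", "w") as f:
    f.write(key["KeyMaterial"])  # needed later to decrypt the admin password

sg = ec2.create_security_group(GroupName="accpac-sg",
                               Description="RDP and HTTP only")
ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": p, "ToPort": p,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]}
        for p in (3389, 80)  # RDP and HTTP, as in step 7
    ])

resp = ec2.run_instances(ImageId="ami-0123456789abcdef0",  # placeholder Windows AMI
                         InstanceType="t3.small",
                         MinCount=1, MaxCount=1,
                         KeyName="accpac-key",
                         SecurityGroupIds=[sg["GroupId"]])
print(resp["Instances"][0]["InstanceId"])
```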

At this point we have created a virtual image and have it running. From the AWS Management Console EC2 dashboard, we should see one running instance. We should also see 1 EBS volume. The EBS volume is the disk image of your virtual image. If you want to you can create snapshots of your EBS volume (you have to pay to store these) so you can go back to them if you mess up your image. So now we have our own Windows 2008 server running in the Amazon cloud, great, but now what do we do? How do we connect to it? How do we transfer files to it? How do we browse to it? What are the Administrator and SQL Server passwords? Now we’ll go through the steps of getting the Administrator Password, connecting via RDP and installing Accpac.

  1. Select the instance that you have running in the management console. From the instance actions menu, choose “Get Windows Admin Password”.  If this doesn’t work, you may need to give the instance a bit more time to start. You will get a dialog that wants you to take the file you downloaded back at step 2, load it into notepad and copy/paste its contents into this dialog. Then this dialog will go off and do a long cryptographic calculation and tell you the Windows Password.
  2. Now you can run Remote Desktop and connect to your instance (if you choose Connect from the instance menu it will download a file that will start RDP with the right parameters). Use the public DNS as the computer name (from the pane with the instance details below the instance list). Administrator is the login. Be careful: copy/pasting the password can be difficult, because Windows tends to add a space when you copy it. If copy/paste doesn’t work, try just typing the password. Now you are logged in and running. Perhaps the first thing you want to do is change the Administrator password to something easier to type and remember. Now you can treat this virtual Windows Server 2008 just like any other remote server.
  3. Copy the installation image for Accpac onto the virtual machine. You can use an FTP site or any other file copy mechanism to do this. One convenient method in Windows 7 is that RDP can make local drives accessible to the remote computer: if you choose Options – Local Resources you can expose some drives to the remote computer, and they will then show up in Windows Explorer there.
  4. Now we need to enable SQL Server; by default the service is disabled and authentication is set to Windows Authentication only. Go to Administrative Tools – Services, set the SQL Server services to Automatic and start them. In the SQL Server Configuration Manager, enable TCP/IP and set the port to 1433. In the management console, set the authentication mode to SQL Server and Windows Authentication, then go to the sa user and enable it. Now restart the SQL Server service. Create your Accpac databases such as PORTAL, SAMSYS, SAMINC, SAMLTD, …
  5. Run the Accpac installation you copied into the image and perform the usual steps to get Accpac up and running. When running database setup, make sure you use localhost as the server name and not the current Windows instance name, because this will change each time you run the image.

We now have Accpac up and running and can access Accpac via RDP. To access the Portal use the public DNS as the server name in the usual URL for running the portal:

http://<public_DNS_goes_here>/SageERPAccpac/portal60a/portal.html

Voilà, you are running in the cloud.

If you shut down this instance and restart it, you will get a new computer name and a new public DNS. This can be rather annoying if you like to set up browser shortcuts and such. If you want to avoid this you need to allocate a static Elastic IP address from AWS (doing this costs a small amount of money). Then you can associate this IP address with the instance and it will stick. Further, you could purchase a meaningful domain name and associate it with this IP address. If you don’t want to purchase one, another trick is to use TinyURL.com to generate a URL for your IP address. This isn’t a very meaningful URL, but it’s better than the raw IP address.
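Allocating and attaching the Elastic IP can also be scripted. A sketch, again using boto3, with a placeholder instance ID:

```python
# Give the instance a stable public address that survives restarts.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

addr = ec2.allocate_address(Domain="vpc")  # static public IP (billed while held)
ec2.associate_address(InstanceId="i-0123456789abcdef0",  # placeholder instance
                      AllocationId=addr["AllocationId"])
print("Stable address:", addr["PublicIp"])
```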

How Well Does It Run?

Once running, how does it compare to a local server? With the small configuration you are limited a bit in memory. Running the Sage ERP Accpac 6.0A portal on the virtual image within an RDP session is a bit slow. However, running the browser locally and hitting the server remotely is quite quick. This implies that the small image is sufficient for the server processes for a few users, though you will need to increase the memory and/or the processing power for more. The nice thing with Amazon is that you can change this fairly easily and only pay for what you are using. It also shows that the Amazon datacenters have quite good network latency, probably better than you could get hosting the server yourself for remote users.

Going Production

So can you go production with this? Certainly the platform can support it. The current sticking point is terminal server or Citrix licenses. These are available through various programs such as: http://community.citrix.com/pages/viewpage.action?pageId=141100352. However you need to be part of one of these Microsoft or Citrix programs where they give you specific permission to migrate your licenses to EC2. While we still have Windows Desktop components this is a potential sticking point. However once Sage ERP Accpac 6.1A comes out and we can run all the main accounting applications through the web, then this problem goes away.

Amazon is also addressing other compliance-type concerns, for instance achieving PCI DSS Level 1 Compliance (http://aws.amazon.com/security/pci-dss-level-1-compliance-faqs/?ref_=pe_8050_17986660) and ISO 27001 Certification (http://aws.amazon.com/security/iso-27001-certification-faqs/?ref_=pe_8050_17986660). Receiving this sort of certification removes a lot of obstacles to using Amazon for a production environment.

Also, if you want to back up your data locally, you will need to copy a backup of your SQL Server database over the Internet, which could be quite time consuming, but you can let it run in the background.

Summary

Amazon’s EC2 service offers an excellent way to access extra computing resources at a very low cost. You can deploy services to regions around the world and dynamically adjust the computing resources you are using. For developers this is a very cheap way to obtain access to test servers during development. For partners this is an excellent way to establish demo servers. For education this is an excellent way to learn how to work with different operating systems and to practice installations.

Written by smist08

December 17, 2010 at 9:41 pm