Stephen Smith's Blog

All things Sage ERP…


Sage 300 ERP 2016 SQL Server Only


Introduction

If you were able to attend the Sage 300 ERP roadmap sessions at Sage Summit you would have heard that the next major release of Sage 300 ERP (named 2016 but released in 2015) will be dropping support for Pervasive.SQL and Oracle as database servers. This means the only supported database will be Microsoft SQL Server, although we will support several versions of SQL Server along with the Azure SQL flavor.

The intent of this article is to make sure everyone has plenty of advance warning about this change, to explain the rationale behind the decision, and to help you formulate a migration plan if you aren't already running SQL Server.

Pervasive.SQL


The first Windows version of Sage 300 ERP (then called CA-Accpac/2000) was released supporting one database, which was good old Btrieve 6.15. We all have fond memories of those days when the world was much simpler: we just needed a simple, robust database manager without any other real concerns. At that time we had a good bundling deal with Btrieve, so we could include a database engine with every System Manager. Btrieve was then owned by Novell and was a good low cost database manager that supported transactioning, was used by many ERPs, and was relatively easy to install and administer. Novell later sold Btrieve back to its original developers, it evolved into Pervasive.SQL, and last year that was acquired by Actian.

Pervasive.SQL still has the same qualities that Btrieve 6.15 had, but it hasn't really kept up with its competitors. SQL Server now has a free edition and Microsoft is much more favorable to doing bundling deals. Plus there are now many better low cost database alternatives such as SQLite and MySQL.

Over the past few years the higher end databases have become much easier to install and manage. Long gone are all the configurable parameters that would plague SQL Server installations (i.e. the defaults now work for most cases). So Pervasive.SQL no longer has an ease of use advantage.

Anyway Btrieve was the first database that Sage 300 ERP supported, and I think a lot of people have fond memories of Btrieve, but today it doesn’t seem to have a place anymore.

A lot of Sage 300 ERP installations require integrations to many other products, and nearly none of these support Pervasive.SQL. Hence if you want integration with BI tools, or other ERP related software, you are almost always forced to use SQL Server anyway. In the early days of Sage 300, SQL Server was very expensive and most products supported Btrieve as a low cost alternative, but today the need for that has disappeared and we are one of the last vendors to still be supporting Pervasive.SQL.

Oracle


We’ve had Oracle support for a while now. However the sales numbers have never really justified the resources required to support this platform. Oracle tends to be the database of choice for larger companies that tend to be bigger than Sage 300’s target market. We’ve made a few government and large company sales because we support Oracle, but generally these customers would have been as well served by SQL Server.

Our perspective is that the demand for Oracle has waned and that they are really pursuing larger and larger companies and moving further and further away from our market segment.

Multiple Product Integrations

Most Sage 300 ERP sites these days involve multiple products working together, such as Sage CRM and Sage HRMS. Generally people only want to work with one database system, and the common one across the various products is SQL Server. Most products support a choice of databases; for example Sage CRM supports Oracle and SQL Server, and Sage HRMS supports FoxPro and SQL Server. Getting a more uniform experience across all these products really only works well if you choose SQL Server. It's generally nicer to have just one set of database procedures for things like backup.

Further when you start to use more advanced cross product reporting tools, these can only do their job if all the products are based on the same database engine (so that SQL joins can work properly, etc.).

Database Architecture

The Sage 300 ERP architecture is still the same and supports multiple databases; whether we support another database than SQL Server in the future will depend on the future of the database market. It might be that a lighter weight SQL engine like SQLite is best, or one of the new NoSQL databases that are becoming popular like HBase or MongoDB. Certainly the NoSQL databases support capabilities that SQL Server can only dream of. Similarly products like SQLite also run on all the mobile and alternative operating systems, opening up all sorts of other possibilities. Chances are these would be introduced in a hybrid manner combined with SQL Server, rather than as solutions that handle 100% of our system's needs.


For the short term we will be concentrating on SQL Server, which means we can use some features that are more specific to SQL Server. However one of our keys to success has been sticking to the core SQL engine functionality, so that we work fine with SQL Express and Azure SQL (unlike a number of competitors). So we will be careful to ensure anything we do doesn't break our database independence or our flexibility in supporting all flavors of SQL Server.

Moving to SQL

If you are running an unsupported database and want to move to Sage 300 ERP 2016 then you will need to convert the database. To convert from an unsupported database like Pervasive.SQL, DB2 or Oracle, you need to run Database Dump on your databases, create SQL databases for these in SQL Management Studio, create the entries in Database Setup and then run Database Load. Make sure that you update and test your backup/restore and disaster recovery plans to ensure that you are still protected.

The conversion must be done before upgrading, since the 2016 version doesn’t include the unsupported database drivers and can’t access these databases and hence can’t do a Database Dump on them.

If you leave Pervasive, DB2 or Oracle databases in Database Setup then these won't show up in any sign on dialogs. We've changed the message when you run the desktop, so that if you don't have any databases because they are unsupported, it explains why this is the case and lets you run Database Setup.

If you don't want to switch to SQL Server, it just means the last version you can upgrade to is Sage 300 ERP 2014. This will be supported for its normal lifecycle. When it goes out of support, of course your software will still operate, but you won't be able to get any new Service Packs or Hotfixes. This should give you quite a large window in which to switch. These days nearly all new sales are SQL Server, the number of SQL installs is the largest share, and of course everyone already running SQL Server won't be affected.

Summary

The database world is changing and Sage 300 ERP needs to change with it. That’s why we are making these changes. We hope that converting your Pervasive or Oracle databases to SQL Server won’t be too painful and that you will get quite a few long term benefits from this move.


Written by smist08

September 2, 2014 at 12:23 am

Sage Intelligence Go!


Introduction

A new flavor of Sage Intelligence was demonstrated by Himanshu Palsule during his keynote at Sage Summit called Sage Intelligence Go! (SIG). I thought I’d spend a bit of time this week providing a few more details about what it is and where it fits into the scheme of things.

Sage Intelligence is a business intelligence/reporting tool used by many Sage ERP products. Those who have been around the Sage 300 ERP community will recognize it as Alchemex, which Sage purchased a few years ago. This is a product that runs as a Microsoft Excel add-in which extracts data from the ERP to be manipulated by the power of the Excel spreadsheet engine. This is a very popular way of doing Financial Reporting, since you often want to present the data graphically, perform complex calculations, or slice and dice the data using pivot tables. Excel provides a great platform for performing these tasks. The original Financial Reporter bundled with Sage 300 ERP works in a similar manner as an Excel add-in. Sage Intelligence is used for quite a few things beyond Financial Accounting, including Sales Analysis and other predictive type BI functions.

Sage Intelligence has been around for a while and is a good mature product. Since it is written with the full Excel add-in SDK, it must run in Excel and specifically the locally installed version of Excel. This means it isn’t particularly well suited for cloud applications such as Office Online.

Microsoft has now released a newer SDK for developing Office apps. This new SDK is designed to allow you to develop applications that can still run in the locally installed full Microsoft Excel, but they can also run inside the cloud/web based Excel Online as well as the new specialty versions of Office like the one for the iPad.

As Sage ERP applications move to the cloud, you would want your Financial Reporter to be cloud based as well. You would want to be able to edit Financial Reports in the browser as well as display and interact with reports on any device, including your tablet or smart phone. This is where Sage Intelligence Go! comes in. It is written in the new Office Apps SDK and will run on the online and device versions of Excel as well as the full locally installed Excel. Thus you don't need a Windows PC at all to use your cloud based ERP and cloud based Financial Reporter. However you still have all the power of Excel helping you visualize and manipulate your reports.


Sage One Accounting

Sage One Accounting by Sage Pastel is such a cloud based ERP system. This product has Sage Intelligence Go! as an option for performing various reporting needs. When SIG started out here, it was much simpler. The original Office Apps SDK was very simple and didn't have nearly the power of the various older SDKs supported by local Excel. However as time has gone by, the Office App SDK has become much stronger and the functionality is now much more in line with what we expect of such a solution.

Office Apps

For those of you who attended Sage Summit and saw James Whittaker's keynote, you would have seen the importance Microsoft is placing on the new Office Apps. Basically, rather than using the browser and search, or apps from a mobile device app store, you will get your data directly in your business applications via this new sort of app. If you go to the Office App Store and search on Sage, you will find a Sage One app for Outlook and Sage Intelligence Go! If you search a bit more broadly, you won't find that many non-trivial applications. Sage Intelligence Go! is probably the most sophisticated Office App in the store.

Notice that you don't run SIG reports from the ERP application; they run from Excel. Once you have the XLSX file, you can just run it anytime from any version of Excel and have full access to your data. This is really a new paradigm for doing reporting. For SIG this works really well. Whether other applications fit this model is yet to be seen.

Web Services

SIG needs to communicate both with the Office API (whether cloud or local) and with the ERP to get the data to process. Both of these functions are accomplished via RESTful web services. The communication with the ERP must be via Web Services, since the calls could originate from the SIG Office App running in the Microsoft Office Online cloud or from the Office App running in the local Excel. It isn't possible to use traditional methods of database integration like ODBC when the two may be running in completely different locations.

Basically the Sage Cloud based ERP exposes a standard set of Web Services that are protected by Sage ID. When the user starts SIG, they get a Sage ID login prompt from within Excel and then this is transmitted through to the Sage Cloud ERP, which uses the Sage ID to look up which tenants this user runs as. This information is relayed back to SIG in case it needs a second prompt to ask which tenant (company) to access.

Sage Intelligence Go consists of a server component that runs in the cloud as well as the Office App that runs in Excel. The Server portion of SIG uses the provided Web Services to load the data to be reported on from the ERP into its in-memory database for processing. This in-memory database is maintained in the cloud where the SIG service runs. Then the Office App part of SIG interacts with the server side to present the correct required data and perform various processing functions.
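To make the plumbing a bit more concrete, below is a minimal C# sketch of the kind of RESTful call involved. The URL, route and class names are hypothetical (the article doesn't publish the real endpoints); only the general pattern of passing the Sage ID as a bearer token over https is taken from the description above.

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class SigWebServiceSketch
    {
        // Hypothetical base URL; the real SIG/ERP service endpoints aren't public here.
        private const string BaseUrl = "https://erp.example-sage-cloud.com/api/";

        // Ask the ERP which tenants (companies) the signed-in Sage ID can access.
        static async Task<string> GetTenantsAsync(string sageIdToken)
        {
            using (var client = new HttpClient { BaseAddress = new Uri(BaseUrl) })
            {
                // The Sage ID obtained from the login prompt travels as a bearer token.
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", sageIdToken);

                HttpResponseMessage response = await client.GetAsync("tenants");
                response.EnsureSuccessStatusCode();
                return await response.Content.ReadAsStringAsync();
            }
        }
    }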


ERP Integration

This solution requires that the ERP provide Web Services that are exposed to the Internet in general, so that they can be accessed via Excel running anywhere, whether installed locally, running on an iPad or running as a Web Application in the cloud. This means these Web Services need to be secure and available 24×7 with very good availability. For this reason you will see SIG first integrating with Sage Cloud based ERPs (like the current Sage One Accounting by Sage Pastel) and later via the Sage Data Cloud, which will offer these services on behalf of on-premise installed ERP systems.

Summary

Sage Intelligence Go! is our solution for cloud reporting. Its primary purpose is to provide Financial Reporting capabilities, but it is also capable of handling other BI type reporting needs. This is our solution to moving a lot of reporting functions into the cloud.


SEOS


Introduction

Sage ERP Online Services (SEOS) was a system developed alongside the online/cloud versions of Sage 200 ERP (UK) and Sage Murano ERP (Spain). The purpose of this was to provide a common set of functionality for all Sage online applications, such as creating new tenants and users, managing audit logs, managing cloud credentials and managing cloud resources. The system also handled common functions such as integration with the various Sage billing systems. Although SEOS was originally developed as part of the Sage 200 and Sage Murano online projects, it was always intended to be a general tool used by all Sage cloud products. SEOS has been in successful production with both Sage 200 and Sage Murano for some time now, and it is now being adopted by various other Sage cloud products both in Europe and in North America.

A lot of what SEOS does is behind the scenes, and customers and partners generally don't see it as a separate service. But I thought it might be of interest to people to know what is or will be going on behind the scenes in our various cloud products. This article is just talking about SEOS in general and doesn't make any claims as to which will be the next product to adopt SEOS or any timelines for such adoption. But people using Sage 200 or Sage Murano probably already have some experience using the SEOS portal.

One of the key goals of SEOS is to automate everything so there are no manual processes in creating new tenants or users, but additionally it orchestrates disaster recovery, backup/restore and the auto-scaling of resources.

SEOS Services

SEOS has a number of different ways to interact with it. There is a web portal for DevOps/IS type people who run the system, there is a web portal for customers and partners and then there are a number of APIs for the various ERPs to communicate with. Some of the main components are shown below.

[Figure: the main components of SEOS]

The main job an ERP has to do to integrate with SEOS is to provide a provisioning engine. This engine runs all the time (actually at least two of them for high availability). The engine checks the SEOS job queue to see if it has anything to do. Tasks it might be asked to perform include creating a new tenant, creating a new user or suspending a tenant.
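As a purely hypothetical sketch of what such an engine's main loop might look like (none of these type or method names are real SEOS APIs, which aren't public):

    using System;
    using System.Threading;

    enum JobType { CreateTenant, CreateUser, SuspendTenant }

    class ProvisioningJob { public JobType Type; public string TenantId; }

    class ProvisioningEngineSketch
    {
        static void Main()
        {
            while (true)
            {
                // Poll the SEOS job queue for work (hypothetical call).
                ProvisioningJob job = SeosJobQueue.GetNextJob();
                if (job == null)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(30)); // nothing to do yet
                    continue;
                }

                switch (job.Type)
                {
                    case JobType.CreateTenant:  /* create database, app servers if needed */ break;
                    case JobType.CreateUser:    /* add the user to the tenant */ break;
                    case JobType.SuspendTenant: /* disable access, keep the data */ break;
                }

                SeosJobQueue.ReportCompleted(job); // acknowledge back to SEOS
            }
        }

        // Stub standing in for the real (non-public) SEOS queue API.
        static class SeosJobQueue
        {
            public static ProvisioningJob GetNextJob() { return null; }
            public static void ReportCompleted(ProvisioningJob job) { }
        }
    }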

Bootstrapping the System

To create a new cloud instance of an ERP that is integrated with SEOS just requires the provisioning engine running. This is either set up manually or via some PowerShell scripts. Once the provisioning engine is going, everything else is done in the provisioning engine as it gets messages from SEOS. When it gets its first create tenant message, it will see that no application servers are running and create a couple (for high availability), create the database and do anything else that is needed for that tenant to be able to operate. For the first tenant that could mean creating quite a few storage, database, VM or other resources.

Then as tenants are added, the provisioning engine will be monitoring (with the help of SEOS) the usage of the system and will start additional resources (like application servers), if it determines that the system needs to scale to handle the additional load.

Thus nearly all the work of creating a large cloud system, possibly consisting of hundreds of VMs, will be brought about automatically via the SEOS/provisioning engine system.

This then helps with disaster recovery, since when SEOS switches to the alternate data center, it handles switching the databases and basically bootstraps the whole process the same way. Note that databases are a bit different, since they are already being replicated to the alternate site and you just want to switch the backup to primary and then use that.

Multi-Cloud

When the provisioning engine requires a new resource like an Azure SQL database or a new Web Role (VM), it doesn’t just go to the cloud API to get it (say using the Azure SDK). Instead it asks SEOS for the resource and SEOS creates it for the ERP. This way the ERP isn’t using the native cloud API, instead it just uses the SEOS API. This then opens up the possibility for hosting these cloud ERPs in different clouds.
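A hypothetical C# illustration of this indirection (the interface and class names are invented, but the pattern is the one described: ERP code asks SEOS for resources and never touches a cloud vendor's SDK directly):

    // The ERP codes against this interface; only SEOS's implementation of it
    // knows whether the resource comes from Azure or some other cloud.
    public interface ICloudResourceProvider
    {
        string CreateSqlDatabase(string tenantId);      // returns a connection string
        string CreateWebRole(string roleName, int instanceCount);
    }

    public class SeosResourceProvider : ICloudResourceProvider
    {
        public string CreateSqlDatabase(string tenantId)
        {
            // ... call the SEOS API, which calls the current cloud's native API ...
            return "Server=tcp:example;Database=" + tenantId;
        }

        public string CreateWebRole(string roleName, int instanceCount)
        {
            // ... ditto for VMs/web roles ...
            return roleName;
        }
    }

Switching cloud providers then means swapping the provider implementation inside SEOS, with no changes to the ERP's provisioning code.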

Currently all the SMB ERPs are hosted in the Microsoft Azure cloud because we get very good pricing on this and it meets our needs extremely well. However we don't want to put all our eggs in one basket, and if conditions change dramatically we can much more easily switch to other providers. There are other reasons we may need to do this; for instance Azure doesn't have a data center in Africa and we have a lot of customers in Africa, so we may need a provider closer than Singapore.

DevOps

DevOps is the group that runs all the various Sage Cloud offerings (its official name varies from region to region, but the idea is the same). Having DevOps manage dozens of cloud products all working different ways with different maintenance procedures would be a huge challenge. SEOS brings all these aspects together into one set of procedures.

Take logging for example. It's easy for any application to generate huge log files for diagnostic and auditing purposes, and there are lots of good APIs for doing this. But managing these logs is a challenge. The logs need to be archived for future reference. Generally there are several types of logs, like the Windows Event Log, an application diagnostic log and an application security log. All these need to be kept in a central spot, including backing them up. There has to be an easy way to search them, say by tenant, event type or date and time. Many people use Elastic Search to search their logs. SEOS provides a uniform way of managing these and automates the whole process, avoiding manual procedures and mistakes. This way DevOps only needs to know one way of managing logs and not a separate way for each ERP.

Billing Engines

Sage is a large company and operates around the world. Most of our products are charged for, and the backend systems that do billing vary from region to region. Various geographic regions have extra regulatory requirements and different business rules. SEOS handles providing usage data to the billing engine via a standard adapter; each region has to write an adapter to interface to SEOS. Removing this burden of interfacing to the Sage billing systems is a huge benefit to the ERP teams, who really don't want to have to deal with these details.

Summary

If you are a partner or customer using Sage SMB Cloud offerings in Europe, then you have probably already seen SEOS and know something about it. If you are using Sage SMB Cloud offerings in other parts of the world, then you will probably start to see SEOS appearing over the next while. These will probably appear more as features or services, but they are being brought to you by the SEOS global initiative.

Written by smist08

August 9, 2014 at 3:34 pm

Sage Summit 2014


Introduction

I'm just back from Sage Summit 2014, which was held at the Mandalay Bay Resort/Hotel in Las Vegas, Nevada. There were over 5200 attendees at the show, a new record for Sage. The Mandalay Bay is a huge complex and I racked up a record number of steps for GCC getting from one place to another. Las Vegas is easy to get to for most people since there are a lot of direct flights from around North America, and you can find really cheap hotel accommodation near the conference (like $29 at the Excalibur, which is connected to Mandalay Bay by a free tram). The only downside to having the conference in Vegas is that smoking is still allowed in many public places, which is really annoying.

The conference had a great many guest speakers including quite a few celebrities like Magic Johnson and Jessica Alba. The convention trade show wasn't just booths; there were also open speaking theatres that always had something interesting going on, as well as the Sage Innovation Lab Exhibit.

There were a great many product breakout sessions as well as a large number of breakout sessions on general business and technology topics. The intent was to make Sage Summit a place to come to for a lot more than just learning new technical details about your Sage product, or promoting new add-ons for you to purchase. A lot of customers attending the show told me that it was these general sessions on accounting, marketing and technology that they found the most useful.

The show was huge and this blog post just covers a few areas that I was directly involved in or attended.

Great General Sessions

Besides the mandatory Sage keynotes, there were quite a few general sessions which were quite amazing. My favorite was Brad Smith's interview with Biz Stone, the co-founder of Twitter and Jelly. Biz certainly provided a lot of interesting context on Web startups, as well as a lot of interesting stories of why he left Google and chose the path he chose. It was certainly interesting how many of the successful founders left very secure, lucrative careers to really struggle for years to get their dreams off the ground. A common theme was the need for persistence so you could survive long enough to eventually get that big break. Another common theme was to follow people and ideas rather than companies and money. Now I'm going to have to read Biz's book: "Things a Little Bird Told Me: Confessions of the Creative Mind".


Another very popular session was the panel discussion with Magic Johnson, CEO of Magic Johnson Enterprises, Jessica Alba, co-founder of the Honest Company, and Carey Smith, CEO of Big Ass Solutions. This discussion concentrated on their current businesses and didn't delve into the celebrity pasts for which at least two of the panelists are rather well known. There were a lot of good business tips given, and it was interesting to see how Magic Johnson and Jessica Alba have adapted what they did before to becoming quite successful CEOs.


Sage’s Technology Vision

A lot of Sage's technology and product presentations were about our mobile and cloud technology vision. The theme was to aggressively move into these areas with purposeful innovation that still protects the investment our customers have in our current technologies. At the heart of this vision is the Sage Data Cloud. This acts as a hub which mobile solutions can connect to, as well as a way that data can be accessed in our existing products whether in the cloud or installed on premise. Below is the architectural block diagram showing the main components of this.

[Figure: architectural block diagram of the Sage Data Cloud components]

This is perhaps a bit theoretical, but we already have products in the market that are filling in key components of this vision. Some of these are included in the next diagram.

[Figure: products already in market filling in parts of this vision]

We use the term "hybrid cloud" quite a bit; this indicates that you can have some of your data on premise and some of your data in the cloud. There are quite a few use cases that people desire. Not everyone is sold on trusting all their data to a cloud vendor for safe keeping. In some industries and countries there are tight regulatory controls on where your data can legally be located. The Hybrid Cloud box in the diagram includes Sage 50 ERP (US and Canadian), Sage 100 ERP and Sage 300 ERP.

To effectively operate mobile and web solutions, you do need to have your data available 24×7 with a very high degree of uptime and a very high degree of security. Most small or mid-sized business customers cannot afford sufficient IT resources to maintain this for their own data center. One solution to this problem is to synchronize a subset of your on premise ERP/CRM data to the Sage Data Cloud and then have your mobile solutions accessing this. Then it becomes Sage’s responsibility to maintain the uptime, 24×7 support and apply all the necessary security procedures to keep the data safe.

Another attraction for ISVs is to integrate their product with the Sage Data Cloud and then let the Sage Data Cloud handle all the details of integrating with the many Sage ERP products. This way they only need to write one integration rather than separate integrations for Sage 50 ERP, Sage 100 ERP, Sage 300 ERP, Sage 300 CRE, etc.

We had a lot of coverage of the Sage 300 Online offering, which has been live for a while now. This was introduced at last year's Summit and now offers Sage 300 ERP customers the choice of installing on premise or running in the Azure cloud. Running in the cloud saves you having to back up your data, perform updates or maintain servers and operating systems. This way you can just run Sage 300 and let Sage handle the details. Of course you can get a copy of your data anytime you want and even move between on premise and the cloud.

The Sage Innovation Lab

On the trade show floor we had a special section for the Sage Innovation Lab. Here you could play with Google Glass, Pebble watches, 3D printers and all sorts of neat toys, and see some prototypes and experiments that Sage is working on with these. We don't know if these will all be productized, but it's cool to get a feel for how the future might begin to look.

Summary

This really was Sage Summit re-imagined. There were a great many sessions, keynotes and demonstrations on all sorts of topics of interest to businesses. This should be taken even further at next year's Sage Summit, which will be in New Orleans, LA on July 27-30, 2015. Does anyone else remember all those great CA-Worlds in New Orleans back in the 90s?

Written by smist08

August 2, 2014 at 8:07 pm

Using the Sage 300 .Net Helper APIs


Introduction

The main purpose of our .Net API is to access our Business Logic Views which I’ve blogged on in these articles:

An Introduction to the Sage 300 ERP .Net API
Starting to Program the Sage 300 ERP Views in .Net
Composing Views in the Sage 300 ERP .Net API
Using the Sage 300 ERP View Protocols with .Net
Using Browse Filters in the Sage 300 ERP .Net API
Using the Sage 300 .Net API from ASP.Net MVC
Error Reporting in Sage 300 ERP
Sage 300 ERP Metadata
Sage 300 ERP Optional Fields

However there are a number of simple things that you need to do repeatedly, and it would be a bit of a pain to go through the Views every time to do them. So in our .Net API we provide a number of helper APIs that give you quick, efficient mechanisms to access things like company, fiscal calendar and currency information.

With Sage Summit 2014 just a few weeks away (you can still register here), I can't pre-empt any of the big announcements here in my blog (as much as I'd like to), so perhaps a bit of an easier .Net article instead. For many, these examples are fairly simple, but I'm always getting requests for source code, and I happen to have a test program that exercises these APIs that I can provide as an example. This program was to help test and debug these APIs for our 64 Bit/Unicode version, which might explain why it tends to print rather a strange selection of fields from some of the classes.

Sample Program

The sample program for this article is a simple WinForms application that uses the Sage 300 ERP API to get various information from these helper classes and then populates a multi-line edit control with the information gathered. The code is the dotnetsample (folder or zip) located in the folder on Google Drive at this URL. The code is hard coded to access SAMLTD with ADMIN/ADMIN as the user id and password. You may need to change this in the session.Open call to match what you have installed/configured on your local system. I've been building and running this using Visual Studio 2013 with the latest SPs and the latest .Net.

[Figure: screenshot of the dotnetsample program]

The Session Class

The Session class is the starting point for everything else. Besides opening the session and establishing the DBLink, you can use this class to get some useful version information as well as some information about the user, like their language.

The DBLink Class

From the session you get a DBLink object that is then your connection to the database and everything in it. From this object you can open any of our Business Logic Views and do any processing that you like. Similarly you can also get quick access to currency and fiscal calendar information from here. Of course you could do much of this by opening various Common Service Views, but this would involve quite a few calls. Additionally the helper APIs provide some caching and calculation support.
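For example, opening a session and getting the DBLink looks something like the following. The Init/Open pattern here follows the earlier articles in the list above; adjust the version string, credentials and company to your own installation:

    using System;
    using ACCPAC.Advantage;

    class SessionSketch
    {
        static void Main()
        {
            var session = new Session();
            // The Init arguments (application ID, program name, version) must
            // match your installed version of Sage 300 ERP.
            session.Init("", "XY", "XY1000", "62A");
            session.Open("ADMIN", "ADMIN", "SAMLTD", DateTime.Today, 0);

            // The DBLink is the connection that everything else hangs off.
            DBLink dbLink = session.OpenDBLink(DBLinkType.Company, DBLinkFlags.ReadWrite);
            Console.WriteLine("Opened SAMLTD");
        }
    }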

The Company Class

Accessing the Company property of the DBLink object is your quick shortcut to the various company options stored in Common Services in the CS0001 View. This is where you get things like the home currency, the number of fiscal periods, whether the company is multi-currency, or address information. Generally you will need something from here in pretty much anything you do.
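Continuing with the dbLink from the sketch above, a quick look at some company options might go like this. The Company property is as described, but the individual property names here are my guesses for illustration; check them against the assembly (or the sample program) rather than taking them as the real signatures:

    // Property names are illustrative; the Company helper exposes things like
    // home currency, number of fiscal periods and multi-currency status.
    Company company = dbLink.Company;
    Console.WriteLine(company.HomeCurrency);
    Console.WriteLine(company.NumberOfFiscalPeriods);
    Console.WriteLine(company.Multicurrency);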

The FiscalCalendar Class

You can get a FiscalCalendar object from the FiscalCalendar property of the DBLink. In accounting, fiscal periods are very important, since everything is eventually recorded in the General Ledger in a specific fiscal year/fiscal period. G/L mostly doesn't care about exact dates, but really cares about the fiscal year and period. For accurate accounting you always have to be very careful that you are putting things in the correct fiscal year and period. In Common Services we set up our financial years and fiscal periods, assigning them various starting and ending dates. Corporate fiscal years don't have to correspond to a calendar year and usually don't. For instance the Sage fiscal year starts on October 1 and ends on September 30.

This object then gives you methods and properties to get the starting and ending dates for fiscal periods, years or quarters. Further it helps you calculate which fiscal year/period a particular date falls in. Often all these calculations are done for you by the Views, but if you are entering things directly into G/L these can be quite useful. Some of the parameters to these methods are a bit cryptic, so perhaps the sample program will help anyone having troubles.
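A sketch of the sort of thing you can do, continuing with the dbLink above. The FiscalCalendar property is as described, but these method names and parameters are hypothetical, since the exact (and admittedly cryptic) signatures aren't listed here:

    FiscalCalendar calendar = dbLink.FiscalCalendar;

    // Hypothetical method names: find the fiscal year/period for a date,
    // then get that period's start and end dates.
    int year, period;
    calendar.GetYearPeriod(DateTime.Today, out year, out period);
    DateTime start = calendar.GetPeriodStart(year, period);
    DateTime end = calendar.GetPeriodEnd(year, period);
    Console.WriteLine("Fiscal {0}/{1}: {2:d} to {3:d}", year, period, start, end);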

The Currency Classes

There are several classes for dealing with currencies: the Currency, CurrencyTable and CurrencyRate classes. You get these from the DBLink's GetCurrency, GetCurrencyTable and GetCurrencyRate methods. There is also a GetCurrencyRateTypeDescription method to get the description for a given Currency Rate Type.

The Currency object contains information for a given currency, like the description, number of decimals and decimal separator. Combined with the Currency Rate Type, we have a CurrencyTable entry for each Currency Code and Currency Rate Type. Then for each of these there are multiple CurrencyRates for each Currency on a given date.

So if you want to do some custom currency processing for some reason, then these are very useful objects to use. The sample program for this article has lots of examples of using all of these.
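For example, continuing with the dbLink above. The Get* method names are the ones from the text; the parameters, and the "SP" spot rate type from the sample data, are my assumptions, so treat this as a sketch rather than the precise signatures:

    // Look up the Japanese Yen, a rate table entry and today's rate.
    Currency currency = dbLink.GetCurrency("JPY");
    Console.WriteLine(currency.Description);

    CurrencyTable table = dbLink.GetCurrencyTable("JPY", "SP");
    CurrencyRate rate = dbLink.GetCurrencyRate("JPY", "SP", DateTime.Today);
    Console.WriteLine(dbLink.GetCurrencyRateTypeDescription("SP"));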

Remember to always test your programs against a multi-currency database. A common bug is to do all your testing against SAMINC and then have your program fail at a customer site who is running multi-currency. Similarly it helps to test with a home currency like Japanese Yen that doesn’t have two decimal places.

Summary

This was just a quick article to talk about some of the useful helper functions in our Sage 300 ERP .Net API that help you access various system data quickly. You can perform any of these functions through the Business Logic Views, but since these are used so frequently, they save a lot of programming time.

Evicting Users from Sage 300 ERP


Introduction

Generally Sage 300 ERP is used in a multi-user environment where users could be distributed across a large building or located at many different sites. Further, Sage 300 ERP uses a concurrent licensing model for users, so if you have 10 Lanpaks then 10 people can log in at once, and it doesn't matter which ten people it is.

Often companies save a bit of money by buying fewer Lanpaks than they have users of the product. Perhaps a clerk works the early shift from 7-3 and then when they go home a Financial Accountant runs some Financial Reports. But what happens if that clerk doesn't sign off? What if they work at home and aren't answering their phone? Now the Financial Accountant gets a message that all the Lanpaks are in use and can't get their work done.

Evicting Users

To solve this problem, Sage 300 ERP 2014 Product Update 2 will be introducing an Evict Users feature. Previously we provided a detailed list of everyone in the system and what they are doing, which I blogged on here. Now you can also kick them out of the system to recover the Lanpak for someone else to use.

From the Current Users screen there is now a push button to “Sign Out Selected Users”. You then get a dialog with a dire warning and are requested to enter the admin password and confirm to kick out the desired user.

[Figure: the Sign Out Selected Users confirmation dialog]

Then in a minute or so, all the screens for that user will be terminated and their Lanpaks will be available for someone else to use.

Technical Details

So how is this accomplished? Basically when you evict a user, the screen stores an encrypted file in the shared data folder. Periodically, any Lanpak Managers running will have a look to see if there is a new file, and if there is, they will see if it is for users they are managing. If so, the Lanpak Manager will kill the processes of all the screens those users are running. The file is left in place for a few minutes, so this particular user won't be able to sign in again immediately.
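In rough sketch form, the Lanpak Manager side of this could look like the following C#. This is entirely hypothetical (the real file is encrypted, and its name, format and location aren't documented here); it just illustrates the signal-file pattern described above:

    using System;
    using System.Diagnostics;
    using System.IO;

    class LanpakEvictionSketch
    {
        static void CheckForEvictions(string sharedDataFolder, string managedUser)
        {
            // File name pattern is invented for illustration.
            foreach (string file in Directory.GetFiles(sharedDataFolder, "evict*.bin"))
            {
                string evictedUser = DecryptUserId(file);   // assumed helper
                if (evictedUser != managedUser) continue;

                // Kill the processes of all screens this user is running,
                // which frees their Lanpaks for someone else.
                foreach (Process p in FindUserScreenProcesses(evictedUser))
                    p.Kill();

                // The file is deliberately left in place for a few minutes so
                // the evicted user can't immediately sign back in.
            }
        }

        static string DecryptUserId(string file) { return "ADMIN"; }                      // stub
        static Process[] FindUserScreenProcesses(string user) { return new Process[0]; }  // stub
    }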

This is a fairly simple scheme that is fairly effective for recovering Lanpaks. It works both for regularly run screens from the desktop as well as web deployed screens from the Web Desktop or from Sage CRM.

Since it's by user, you can kill the ADMIN user, which will kill yourself. If all your users sign in as ADMIN then it will kill all the users on the system. So beware that if multiple users share a user id, then they will all be killed, and not just one workstation.

Performing Upgrades

Another use case that this method only partially addresses is kicking everyone out of the system so you can perform an upgrade (like a Product Update). Generally a DLL or EXE file in Windows is locked if a running process uses it. Hence you can't update Sage 300 if someone has a4wapi.dll in use, for instance. It could be that this method does get everyone out of the system, but there are a few cases where it may not work:

  • If someone signs off the Sage 300 Desktop but leaves it running. In this case the EXE and DLLs are still in use, but since it isn’t using a Lanpak and isn’t associated with a user, it can’t be killed this way. However I tend to think this is fairly unlikely.
  • It won't be effective against things that quickly open sessions, do something and then close the session. This would include things like the Sage CRM integration, where the custom Sage CRM pages open a session, load their data and close the session. However things like this tend to be Web Servers and can usually be stopped remotely or at least from a central place.
  • Things that we allow to use Sage 300 without using a Lanpak. This would include things like parts of the Sage CRM integration, the Sage HRMS integration and the Sage 300 Portal.
  • For third party products, if they are full SDK, it will definitely work on them. If they aren’t full SDK it may or may not work depending on how they are built.

Keep in mind that the main purpose of this feature is to manage Lanpaks; performing upgrades is just a secondary thing, where hopefully it helps, but it may not be a complete solution.

The Scary Warning

The warning message when you run this is fairly severe. Part of this is because we are killing the processes that are connected to our API. Generally you won't corrupt the main database because everything is protected by transactioning. So if you do kill a process while it is, say, posting a batch, it just means the transaction in process will be rolled back and it will have to be done again later. Generally the expectation is that this feature is used once people have gone home and aren't doing anything, in which case it is harmless. Further, in the user list screen you can see what people are doing, so don't kill the person running Day End.

However if someone is using the UI and they are resizing columns in a grid, you could catch things at the wrong time and corrupt a *_p.ism file. But these can be deleted and sometimes repaired with ScanIsam. If you are running a non-SDK third party product, I can’t really say what will happen if it’s killed.

Summary

The Evict Users feature is the number one feature as voted on our Ideas web site, and it is now in the product. So keep making suggestions at https://www11.v1ideas.com/Sage300ERP/Accpac and voting on suggestions in the system that you would like to see implemented.

This feature will make it easier for companies to manage their Lanpaks and get better value from the system. Hopefully this will also make managing upgrades a bit easier as well.

Written by smist08

May 24, 2014 at 4:17 pm

Some Thoughts on Security


Introduction

With the recent Heartbleed security exploit in the OpenSSL library, a lot of attention has been focused on how vulnerable our computer systems have become to data theft. With so much data travelling the Internet as well as wireless networks, this has brought home the importance of how secure these systems are. With a general direction towards an Internet of Things, this makes all our devices, whether our fridge or our car, possibly susceptible to hackers.

I’ll talk about Heartbleed a bit later, but first perhaps a bit of history with my experiences with secure computing environments.

Physical Isolation

My last co-op work term was at DRDC Atlantic in Dartmouth, Nova Scotia. In order to maintain security they had a special mainframe for handling classified data and to perform classified processing. This computer was located inside a bank vault along with all its disk drives and tape units. It was only turned on after the door was sealed and it was completely cut off from the outside world. Technicians were responsible for monitoring the vault from the outside to ensure that there was absolutely no leakage of RF radiation when classified processing was in progress.

After graduation from University my first job was with Epic Data. One of the projects I worked on was a security system for a General Dynamics fighter aircraft design facility. The entire building was built as a giant Faraday cage. The entrances weren't sealed, but you had to travel through a twisty corridor to enter the building, to ensure there was no line of sight for radio waves to pass out. Then surrounding the building was a large protected parking lot where only authorized cars were allowed in.

Generally these facilities didn't believe you could secure connections with the outside world. If such a connection existed, no matter how good the encryption and security measures, a hacker could penetrate it. The hackers they were worried about weren't just bored teenagers living in their parents' basements, but well trained and financed hackers working for foreign governments. Something like the Russian or Chinese version of the NSA.

Van Eck Phreaking

A lot of attention goes to securing Internet connections. But historically data has been stolen through other means. Van Eck Phreaking is a technique to listen to the RF radiation from a CRT or LCD monitor and to reconstruct the image from that radiation. Using this sort of technique a van parked on the street with sensitive antenna equipment can reconstruct what is being viewed on your monitor. This is even though you are using a wired connection from your computer to the monitor. In this case how updated your software is or how secure your cryptography is just doesn’t matter.

Everything is Wireless

It seems that every now and then politicians forget that cell phones are really just radios and that anyone with the right sort of radio receiver can listen in. This seems to lead to a scandal in BC politics every couple of years. This is really just a reminder that unless something is specifically marked as using some sort of secure connection or cryptography, it probably doesn’t. And then if it doesn’t anyone can listen in.

It might seem that most communications are secure nowadays. Even Google search has switched to always using https, which is a very secure encrypted channel, to keep all your search terms a secret between yourself and Google.

But think about all the other communication channels in use. If you use a wireless mouse or a wireless keyboard, then these are really just short range radios. Is this communication encrypted and secure? Similarly if you use a wireless monitor, then it's even easier to eavesdrop on than using Van Eck.

What about your Wi-Fi network? Is that secure? Or is all non-https traffic easy to eavesdrop on? People are getting better and better at hacking into Wi-Fi networks.

In your car, if you are using your cell phone via Bluetooth, is this another place where eavesdropping can occur?

Heartbleed

Heartbleed is an interesting bug in the OpenSSL library that’s caused a lot of concern recently. The following XKCD cartoon gives a good explanation of how a bug in validating an input parameter caused the problem of leaking a lot of data to the web.

[Image: the XKCD cartoon explaining Heartbleed]

At the first level, any program that receives input from untrusted sources (i.e. random people out on the Internet) should very carefully and thoroughly validate any input. Here the request specifies what to reply and the length of the reply. If you give a length much longer than the data actually provided, then it leaks whatever random contents of memory were located beyond it.

At the second level, this is an API design flaw, that there should never have been such a function with such parameters that could be abused thus.

At the third level, what allowed this to go bad is a performance optimization that was put in the OpenSSL library to provide faster buffer management. Before this performance enhancement, this bug would just have caused an application fault. This would have been bad, but easy to detect, and wouldn't have leaked any data. At worst it would have perhaps allowed some short lived denial of service attacks.
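To illustrate the first-level point about validating input, here is a simplified sketch in C# (the real code is OpenSSL's C; this is just the shape of the check that was missing): the claimed reply length must be validated against the payload actually received.

    using System;

    class HeartbeatSketch
    {
        // Build the echo reply for a heartbeat request. The claimed length
        // arrives from an untrusted client, so it must be range checked.
        // Trusting it and copying that many bytes is the Heartbleed bug class:
        // it leaks whatever memory happens to sit past the real payload.
        static byte[] BuildHeartbeatReply(byte[] payload, int claimedLength)
        {
            if (claimedLength < 0 || claimedLength > payload.Length)
                throw new ArgumentException("claimed length exceeds actual payload");

            byte[] reply = new byte[claimedLength];
            Array.Copy(payload, reply, claimedLength);
            return reply;
        }
    }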

Mostly, exploiting this security hole just returns the attacker a bunch of random garbage. The trick is to automate the attack to repeatedly try it on thousands of places until by fluke you find something valuable, perhaps a private digital key or perhaps a password.


Complacency

The open source community makes the claim that open source code is safer because anyone can review the source code and find bugs. So people are invited to do this to OpenSSL. I think Heartbleed shows that security researchers became complacent and weren't examining this code closely enough.

The code that caused the bug was checked in by a trusted coder, and was code reviewed by someone knowledgeable. Mistakes happen, but for something like this, perhaps there was a bit too much trust. I think it was an honest mistake and not deliberate sabotage by hackers or the NSA. The source code change logs give a pretty good audit of what happened and why.

Should I Panic?

In spite of what some reporters are saying, this isn't the worst security problem that has surfaced. The holy grail of hackers is to find a way to root computers (take them over with full administrator privileges). This attack just has a small chance of providing something to help along that way and isn't a full exploit in its own right. Bugs in Java, IE, SQL Server and Flash have all allowed hackers to take over people's computers. Some didn't require anything else; some just required tricking the user into browsing a bad web site. Similarly e-mail or flash drive viruses have caused far more havoc than this particular problem. Another ongoing security weakness is caused by government regulations restricting the strength of encryption or forcing the disclosure of keys; these measures do little to help the government, but they really make the lives of hackers easier. I also think that e-mail borne viruses have wreaked much more havoc than Heartbleed is likely to. But I suspect the biggest source of identity theft is from data recovered from stolen laptops and other devices.

Another aspect is the idea that we should be like gazelles and rely on the herd to protect us. If we are in a herd of 1000 and a lion comes along to eat one of us, then there is only a 1/1000 chance that it will be me.

This attack does highlight the importance of some good security practices, such as changing important passwords regularly (every few months) and using sufficiently complex or long passwords.

All that being said, nearly every website makes you sign in. For web sites that I don't care about I just use a simple password, and if someone discovers it, I don't really care. For other sites like personal banking I take much more care. For sites like Facebook I take medium care. Generally, don't provide accurate personal information to sites that don't need it: if they insist on your birthday, enter it a few days off; if they want a phone number, make one up. That way if the site is compromised, they just get a bunch of inaccurate data about you. Most sites ask way too many things. Resist answering these or answer them inaccurately. Also avoid overly nosey surveys; they may be private and anonymous, unless hacked.

The good thing about this exploit seems to be that it was discovered and fixed mostly before it could be exploited. I haven't seen real cases of damage being done. Some sites (like the Canada Revenue Agency) are trying to blame Heartbleed for unrelated security lapses.

Generally the problems that you hear about are the ones that you don't need to worry so much about. But again, it is safe practice to use this as a reminder to change your passwords and minimize the amount of personally identifiable data out there. After all, dealing with things like identity theft can be pretty annoying. And this also helps with the problems that the black hat hackers know about and are using, but that haven't been discovered yet.

Summary

You always need to be vigilant about security. However it doesn't help to be overly paranoid. Follow good on-line practices and you should be fine. The diversity of computer systems out there helps: not all are affected, and those that are, are good about notifying the people affected. Generally a little paranoia and good sense can go a long way on-line.

Written by smist08

April 26, 2014 at 6:51 pm
