Stephen Smith's Blog

Musings on Machine Learning…

Archive for August 2011

Tables and Data Flow of the Accpac Inventory Control Module – Part 2


Last week we started to cover the database structure of the Accpac Inventory Control (I/C) module here.

The main types of database tables in I/C can be categorized as:

  • Setup
  • Master Data
  • Data Entry
  • Data Processing
  • Periodic Processing

We covered Setup and Master Data last week. This week we will cover Data Entry, Data Processing and Periodic Processing.

Data Entry

Data Entry consists of: Receipts, Shipments, Adjustments, Transfers, Assemblies/Disassemblies and Internal Usage.

These functions are only used by the I/C forms for entering data into I/C. Although applications can enter documents into these functions and post them, this isn’t the normal way of interacting with I/C. When applications feed data into G/L, A/R, A/P, O/E or P/O they create and post documents just like a user would, so in G/L Journal Entry you can see all the G/L batches created by entry operators plus a large number of batches entered into G/L by other accounting applications. This is a very uniform manner of entry that keeps everything together. However, it wouldn’t be practical for I/C, since there would then be huge numbers of I/C Receipts and Shipments for everything that affects inventory and costing levels. Hence applications have protocols to update the inventory items directly during posting, without creating I/C documents to do it. This whole process is then orchestrated by Day End Processing, which keeps a unified audit trail so you can accurately track everything that happens.

Receipt Entry

ICREEH (IC0590): Receipt Headers.

ICREED (IC0580): Receipt Details

ICREEDO (IC0585): Receipt Details Optional Fields.

ICREEDS (IC0587): Receipt Detail Serial Numbers.

ICREEDL (IC0582): Receipt Detail Lot Numbers.

ICREEHO (IC0595): Receipt Headers Optional Fields.

ICPSTRE (IC0520): Post Receipts. Superview to post receipts.

ICRCPH (IC0560): Receipt Not Costed Headers. Receipt After Post – uncosted.

ICRCPD (IC0550): Receipt Not Costed Details.

ICRCPDP (IC0555): Receipt Not Costed Detail Optional Fields.

ICRCPDS (IC0558): Receipt Not Costed Detail Serial Numbers.

ICRCPDL (IC0553): Receipt Not Costed Detail Lot Numbers.

ICRCPHP (IC0565): Receipt Not Costed Header Optional Fields.
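For custom reports that go straight at these tables, the pattern to keep in mind is that each detail row links back to its header. Below is a minimal Python/ODBC sketch of that kind of query; the connection string is only an example, and the column names (RCPHSEQ as the header/detail link, plus RCPNUMBER, ITEMNO and QTYRECVD) are assumptions for illustration only – look up the actual field names in the AOM.

import pyodbc

# Hedged sketch: list receipt headers with their detail lines.
# Table names ICREEH/ICREED come from the list above; the column names
# below are assumptions -- check the AOM for the real field names.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=SAMINC;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT h.RCPHSEQ, h.RCPNUMBER, d.ITEMNO, d.QTYRECVD
    FROM ICREEH h
    JOIN ICREED d ON d.RCPHSEQ = h.RCPHSEQ
    ORDER BY h.RCPNUMBER
""")
for row in cursor.fetchall():
    print(row.RCPNUMBER, row.ITEMNO, row.QTYRECVD)
conn.close()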

Shipment Entry

ICSHEH (IC0640): Shipment Header.

ICSHED (IC0630): Shipment Detail.

ICSHEDO (IC0635): Shipment Detail Optional Fields.

ICSHEDL (IC0632): Shipment Detail Lot Numbers.

ICSHEDS (IC0636): Shipment Detail Serial Numbers.

ICSHEHO (IC0645): Shipment Header Optional Fields.

ICPSTSH (IC0530): Post Shipments. Superview to post shipments.

Adjustment Entry

ICADEH (IC0120): Adjustment Headers.

ICADED (IC0110): Adjustment Details.

ICADEDO (IC0115): Adjustment Detail Optional Fields.

ICADEDS (IC0117): Adjustment Detail Serial Numbers.

ICADEDL (IC0113): Adjustment Detail Lot Numbers.

ICADEHO (IC0125): Adjustment Header Optional Fields.

ICPSTAD (IC0500): Post Adjustments. Superview to post adjustments.

Transfers Entry

ICTREH (IC0740): Transfer Header.

ICTRED (IC0730): Transfer Detail.

ICTREDO (IC0735): Transfer Detail Optional Fields.

ICTREDS (IC0738): Transfer Detail Serial Numbers.

ICTREDL (IC0733): Transfer Detail Lot Numbers.

ICTREHO (IC0741): Transfer Header Optional Fields.

ICPSTTR (IC0540): Post Transfers. Superview to post transfer documents.

Assembly Entry

ICASEN (IC0160): Assemblies.

ICASENO (IC0165): Assembly Optional Fields.

ICASENL (IC0162): Assembly Lot Details.

ICASENS (IC0167): Assembly Serial Details.

ICPSTBM (IC0510): Post Bill of Materials. Superview to post bills of materials.

Internal Usage

ICICEH (IC0288): Internal Usage Headers.

ICICEHO (IC0289): Internal Usage Header Optional Fields.

ICICED (IC0286): Internal Usage Details.

ICICEDO (IC0287): Internal Usage Detail Optional Fields.

ICICEDS (IC0284): Internal Usage Serial Numbers.

ICICEDA (IC0283): Internal Usage FAS Details.

Data Processing

Day End Processing

For more details on Day End Processing, have a look at this blog post.

How and when Day End Processing is run is heavily controlled by the various options records in the system. For instance, whether costing is performed during day end or during posting is controlled by the I/C options. These settings are there so you can optimize I/C for your business needs and environment.
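One rough way to see how much work is queued up for the next Day End run is to count the rows sitting in the “not costed” tables listed earlier. This is a hedged sketch only – it assumes those tables hold the transactions still waiting to be costed, and it reuses the same example ODBC connection as before.

import pyodbc

# Hedged sketch: estimate the Day End backlog by counting rows in the
# "not costed" receipt tables (assumed to hold transactions awaiting costing).
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=SAMINC;Trusted_Connection=yes"
)
cursor = conn.cursor()
for table in ("ICRCPH", "ICRCPD"):
    count = cursor.execute("SELECT COUNT(*) FROM " + table).fetchone()[0]
    print(table, "has", count, "rows awaiting costing")
conn.close()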

ICDEND (IC0275): Master Day End Processing.

ICDEP (IC0276): IC Day End Processing.

Calls the following superviews and updates the following tables:

ICCSTP (IC0270): Post Item Costs.

ICPOSTR (IC0420): Receipt Posting.

ICITRP (IC0350): Item Receipts Posting.

ICRECPH (IC0570): Receipt Audit List Headers.

ICRECHP (IC0568): Receipt Audit Header Optional Fields.

ICRECPD (IC0575): Receipt Audit List Details.

ICRECDP (IC0567): Receipt Audit Details Optional Fields.

ICPOSTS (IC0430): Shipment Posting.

ICSHIPH (IC0652): Shipment Audit List Headers.

ICSHPHP (IC0657): Shipment Audit Header Optional Fields.

ICSHIPD (IC0650): Shipment Audit List Details.

ICSHPDP (IC0655): Shipment Audit Detail Optional Fields.

ICPOSTA (IC0400): Adjustment Posting.

ICADJP (IC0140): Post Adjustment Details.

ICADJH (IC0132): Adjustment Audit List Headers.

ICADJHP (IC0135): Adjustment Audit List Header Optional Fields.

ICADJD (IC0130): Adjustment Audit List Details.

ICADJDP (IC0131): Adjustment Audit List Detail Optional Fields.

ICPOSTT (IC0440): Transfer Posting.

ICTRANH (IC0716): Transfer Audit List Headers.

ICTRNHP (IC0720): Transfer Audit List Header Optional Fields.

ICTRAND (IC0714): Transfer Audit List Details.

ICTRNDP (IC0718): Transfer Audit List Detail Optional Fields.

ICPOSTB (IC0410): Assembly Posting.

ICASSMH (IC0182): Assembly Audit List Headers.

ICASMHP (IC0172): Assembly Audit List Header Optional Fields.

ICASSMD (IC0180): Assembly Audit List Details.

ICASMDP (IC0170): Assembly Audit List Detail Optional Fields.

ICHIST (IC0280): Transaction History.

ICHISTS (IC0285): Transaction History Inquiry summary.

ICSTAT (IC0700): Transaction Statistics.

ICSTATI (IC0710): Sales Statistics.

ICXLHIS (IC0815): Lot Number History.

ICXSHIS (IC0835): Serial Number History.

Periodic Processing

I/C has many processing functions that are run on demand (often infrequently). This section gives a sampling of the main I/C Periodic Processing functions:

ICCLRHI (IC0230): Clear History.

ICCLRIN (IC0240): Delete Inactive Records.

ICPRCC (IC0460): Copy Item Pricing.

ICPRCU (IC0470): Update Item Pricing.

ICWKGEN (IC0760): Generate Inventory Worksheet.

ICWKL (IC0770): Inventory Worksheets.

ICWKLO (IC0775): Worksheet Optional Fields.

ICWKUH (IC0790): Inventory Worksheet Headers.

ICWKUD (IC0780): Inventory Worksheet Details.

ICWKUHO (IC0795): Worksheet Detail Optional Fields.

ICWKUHL (IC0793): Inventory Worksheet Lot Numbers.

ICWKUHS (IC0797): Inventory Worksheet Serial Numbers.

ICPOSTW (IC0450): Inventory Reconciliation Posting.

ICGLTR (IC0278): Generate G/L Batches.

ICINTCK (IC0300): Integrity Checker.

ICUPGD (IC0752): Activation.

ICCTOE (IC0271): Transaction Inquiry for OE.

ICPRJNL (IC0483): Reset Printed Posting Journals.

ICLBLH (IC0365): Label Headers.

ICDLDN (IC0277): IC Drill Down.

ICADGEN (IC0128): Process Adjustments.

Summary

Hopefully this gives a bit of help in understanding the I/C module when you are producing custom forms and reports or writing VBA macros.

 

Written by smist08

August 27, 2011 at 7:40 pm

Posted in sage 300


Tables and Data Flow of the Accpac Inventory Control Module – Part 1


This blog posting will be looking at the structure of some of the parts of the Inventory Control (I/C) module of the Sage ERP Accpac product. You can find the structure of the individual tables in the Accpac Application Object Model (AOM), which must be viewed in IE. However, this doesn’t tell you how the tables are related or how data flows from one table to another as you post various transactions. There are 146 tables in I/C, so obviously we can’t describe each one in this two-part blog post; we’ll just look at a few interesting cases. Understanding these relationships can be especially important to people writing sophisticated custom reports or BI views.

The main types of database tables in I/C can be categorized as:

  • Setup
  • Master Data
  • Data Entry
  • Data Processing
  • Periodic Processing

Inventory Control is a fairly big module that you typically don’t see much of. The modules you run are Order Entry and Purchase Order, which then remove things from inventory or add things to it. Both P/O and O/E make extensive use of I/C business logic in everything they do. I/C owns the Day End Processing task, which I blogged about here. Both the Lot Tracking and Serialized Inventory modules actually reside entirely within the I/C module. I/C is responsible for maintaining all the complicated pricing models Accpac supports as well as the various costing models. I/C tracks inventory by location as well as by serial and lot numbers. Below is a diagram showing this flow:

Setup

The ICX tables are used for serial and lot tracking. I/C items, like G/L accounts, have a segmented structure, and several of these tables are used to define it. The Setup tables are:

ICOPT (IC0380): Options.

ICSEG (IC0610): Item Segments.

ICITMS (IC0320): Structure Codes.

ICSEGV (IC0620): Segment Codes.

ICLOC (IC0370): Locations.

ICCATG (IC0210): Categories.

ICCATTX (IC0220): Category Taxes.

ICACCT (IC0100): Account Sets.

ICBOMH (IC0200): Bill of Materials.

ICBOMD (IC0190): Bill of Material Details.

ICPCOD (IC0390): Price List Codes.

ICPCTX (IC0395): Price List Tax Codes.

ICXWARY (IC0850): Warranty Codes.

ICXCONT (IC0800): Contract Codes.

ICXMASK (IC0805): Mask Structures used for lot and serial numbers.

Master Data

I/C has quite a lot of master data. Neither O/E nor P/O has much master data, because most of it is located here in I/C. This includes the main inventory items, along with pricing, location details, reorder quantities, kits, lot numbers and serial numbers.

Items

ICITEM (IC0310): Items.

ICUNIT (IC0750): Units of Measure.

ICITMTX (IC0330): Item Tax Authorities.

ICITMV (IC0340): Vendor Item Number Details.

ICITEMLO (IC0312): Item Lot Optional Fields.

ICITEMO (IC0313): Item Optional Fields.

ICITEMSO (IC0314): Item Serial Optional Fields.

ICITMC (IC0319): Customer Item Numbers.

ICITMB (IC0318): Build Item Number.
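As a quick example of using the item master tables from a report or script, here is a hedged Python/ODBC sketch that lists each item with its units of measure. The join column (ITEMNO) and the DESC/UNIT field names are assumptions for illustration; verify them against the AOM.

import pyodbc

# Hedged sketch: list items with their units of measure by joining ICITEM to
# ICUNIT. Column names (ITEMNO, DESC, UNIT) are assumptions -- check the AOM.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=SAMINC;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT i.ITEMNO, i.[DESC], u.UNIT
    FROM ICITEM i
    JOIN ICUNIT u ON u.ITEMNO = i.ITEMNO
    ORDER BY i.ITEMNO
""")
for itemno, desc, unit in cursor.fetchall():
    print(itemno, desc, unit)
conn.close()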

Item Pricing

ICPRIC (IC0480): Price Lists.

ICPRICP (IC0482): Item Pricing Details.

ICPRTX (IC0490): Price List Tax Authorities.

ICPRICC (IC0481): Pricing Price Check.

Location Details

ICILOC (IC0290): Location Details. Contains quantities and costs.

ICCOST (IC0260): LIFO/FIFO Receipt Costs – Detail records for LIFO/FIFO Qty / Costing.

Using ICILOC can be quite complicated so there are a number of superviews to help with Location Details:

ICLOCI (IC0372): Loc. Quantities (Finder).

ICNCST (IC0375): Transactions Not Costed.

ICILOCG (IC0295): Create Item Location.
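If you do query ICILOC directly, a typical report-style question is “how much of each item do we have on hand across all locations?”. Here is a hedged sketch; the ITEMNO and QTYONHAND column names are assumptions, and the superviews above (ICLOCI in particular) may be the safer route in real code.

import pyodbc

# Hedged sketch: total quantity on hand per item across all locations.
# Column names (ITEMNO, QTYONHAND) are assumptions -- check the AOM, or use
# the ICLOCI superview instead of reading ICILOC directly.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=SAMINC;Trusted_Connection=yes"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT ITEMNO, SUM(QTYONHAND) AS TOTALQTY
    FROM ICILOC
    GROUP BY ITEMNO
    ORDER BY ITEMNO
""")
for itemno, qty in cursor.fetchall():
    print(itemno, qty)
conn.close()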

Other Master Tables

ICCUPR (IC0274): Contract Pricing.

ICIOTH (IC0305): Manufacturer’s items.

ICKITH (IC0356): Kitting Items.

ICKITD (IC0355): Kitting Item Components.

ICREOR (IC0599): Reorder Quantities.

ICREORD (IC0600): Reorder Quantities Details.

ICREORO (IC0601): Reorder optional fields.

ICXSER (IC0830): Inventory Serial Numbers.

ICXSERO (IC0832): Inventory serial number optional fields.

ICXLOT (IC0810): Inventory Lot Numbers.

ICXLOTO (IC0812): Inventory lot number optional fields.

A Superview to help with LIFO/FIFO is:

ICCOSS (IC0258): LIFO/FIFO Inquiry Summary.

Below is a diagram showing how these tables are related:

Often using the I/C views directly can be quite complicated. One trick is to use the O/E or P/O views instead. For instance, figuring out what the price of an item will be for a given customer can be quite complicated; an easier way is to create an order for that customer and insert a detail line for one of that item, and O/E will fill in all the correct pricing information on that detail line for you (then just don’t post the order).

Summary

Hopefully this blog posting provides a bit more insight into how I/C operates and helps when you use I/C, interface to it, or create custom I/C reports or forms.

To be continued with data entry, data processing and periodic processing.

Written by smist08

August 20, 2011 at 8:21 pm

Posted in sage 300


Accpac and SQL Server


Introduction

I blogged about Accpac and its databases here. That was a general article about how Accpac works with the various databases it supports. This article is more specific to Microsoft SQL Server and covers various topics that affect Accpac when using SQL Server for its databases. I’ll be specifically talking about SQL Server 2008, but most of what I say applies equally to SQL Server 2005. Generally SQL Server is quite a reliable and low-maintenance database server that doesn’t require much in the way of configuration and administration. Accpac uses SQL Server in a fairly straightforward manner, issuing standard SQL statements without using SQL Server-specific modules. This means Accpac runs on all editions of SQL Server, since we don’t use features only available in higher editions, and it means we can work with most of SQL Server’s built-in services and configurations. We store the database in an open manner, following standard practices, which allows you to access your Accpac data using standard SQL-aware tools, including most reporting solutions. I also blogged on business continuity here, which is related to the topics discussed in this article.

Setup – Collation

One of the first decisions you need to make is the database collation sequence. This defines the character set used in the database as well as the sort order. Generally, choose the one that is best for your location and the language you use. However, there are a couple of notes to consider:

Most western collation names end in either _BIN or _CI_AS, such as Latin1_General_BIN or SQL_Latin1_General_CP1_CI_AS. _BIN means a binary sort order where strings are sorted by their binary character codes; the upshot is that A–Z comes before a–z and things like accented characters sort at the end. The _CI_AS ones are case insensitive and sort in dictionary order, meaning AaBb…, with accented characters inserted alongside their base letter. A few consequences of this for Accpac are:

  1. If you are migrating from Pervasive.SQL, then since Pervasive uses a binary sort order, choosing a _BIN sort order shouldn’t cause any problems. If you choose a _CI_ one in SQL Server, then since it is case insensitive you will get “record already exists” errors when you load a database that has key fields which vary only by case. Typically this happens with units of measure like Ea, EA and ea. If you want to make this conversion you will need to fix up your Pervasive database before DBDumping.
  2. If you choose a _CI_ sort order, then since strings are case insensitive it doesn’t matter what case you enter when you do searches, say from the Finder; searching for steve would return Steve, STEVE or steve. This is the main reason people want to use this sort order.
  3. If you want to change the sort order, you will need to DBDump, create a new database in the new sort order and DBLoad; however, you could get problems with the keys and need to make some manual adjustments based on the errors returned.
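If you aren’t sure which collation an existing server or company database is using, you can ask SQL Server directly before deciding whether a DBDump/DBLoad conversion is needed. A minimal sketch, assuming a trusted connection and the SAMINC sample database; SERVERPROPERTY and DATABASEPROPERTYEX are standard SQL Server functions.

import pyodbc

# Check the server and database collations before planning a conversion.
# The CONVERT keeps pyodbc happy since these functions return sql_variant.
conn = pyodbc.connect("DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes")
cursor = conn.cursor()
server_coll = cursor.execute(
    "SELECT CONVERT(varchar(128), SERVERPROPERTY('Collation'))"
).fetchone()[0]
db_coll = cursor.execute(
    "SELECT CONVERT(varchar(128), DATABASEPROPERTYEX('SAMINC', 'Collation'))"
).fetchone()[0]
print("Server collation:", server_coll)
print("SAMINC collation:", db_coll)
conn.close()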

Setup – Configuration

At a minimum, you want to have TCP/IP and Shared Memory enabled as protocols. TCP/IP is required for the new Sage ERP Accpac 6.0A Portal, and is the network protocol that works best for the rest of Accpac. Shared Memory is a very high performance protocol that will be used if the program using SQL Server is running on the same computer as the SQL Server service, basically bypassing the whole network stack. The main gotcha here is SQL Server Express, which ships with TCP/IP disabled and is further configured not to receive external requests (the Connections tab of the server properties in Management Studio). These protocols are configured from the SQL Server Configuration Manager:

If you right-click on the TCP/IP protocol, choose Properties and then select the second tab, you can see the network port being used by SQL Server.

Database Setup

If you are running multiple instances of SQL Server, you can enter servername\instancename in Database Setup; however, you can’t do this in the portal configuration. In the portal configuration you must use the server name and the correct port number. Each instance of SQL Server uses a unique port number.
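One way to find the port a particular instance is actually listening on (which is what the portal configuration needs) is to connect to that instance and ask it about your own connection. A hedged sketch; the server/instance name is an example, and local_tcp_port is only populated for TCP connections, so force TCP in the connection string if you normally connect over shared memory.

import pyodbc

# Hedged sketch: ask the instance which TCP port this connection is using.
# local_tcp_port is NULL for shared memory / named pipes connections, so the
# "tcp:" prefix forces a TCP connection. Replace the instance name as needed.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=tcp:localhost\\SQLEXPRESS;Trusted_Connection=yes"
)
port = conn.cursor().execute(
    "SELECT local_tcp_port FROM sys.dm_exec_connections WHERE session_id = @@SPID"
).fetchone()[0]
print("This instance is listening on TCP port:", port)
conn.close()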

When Accpac first supported SQL Server, we only supported access through a pre-configured ODBC data source. This led to problems with people setting these up incorrectly, since quite a few settings were possible. Later on we allowed you to connect by just specifying the server name (or servername\instancename) in Database Setup, and we would then correctly set up an ODBC data source for you (usually servername32). This worked far more reliably. We left the ODBC option there so people already set up this way wouldn’t be affected, or so that if some special configuration values were required you could still set them. I don’t know of any such special configuration values, nor do I know of any good reason to use an ODBC data source rather than the server name. So I would strongly recommend you always use the server name in Database Setup to configure your SQL Server companies.

Database Dump and Load

DBDump and DBLoad are two utilities that come with Accpac that let you easily dump databases into a database-independent format and reload them. They allow you to easily move Accpac companies between locations or between different database servers. I blogged on these previously here. These are very handy utilities and have some interesting side effects, like re-indexing and compacting databases (since they completely re-create and reload the database). However, they aren’t as fast as the built-in SQL Server backup and restore facilities.

64 Bit SQL Server

Although Sage ERP Accpac is a 32-bit application, it works just fine with the 64-bit version of SQL Server. You cannot directly load 32-bit DLLs into a 64-bit application, but this isn’t how SQL Server works. Accpac (or any other program) is never loaded into SQL Server’s process space. All communication is between two processes using a standard, well-defined protocol, so it doesn’t matter whether either is 32 or 64 bits. Generally you want to run the 64-bit version of SQL Server on a 64-bit version of Windows Server 2008. This way you can get better performance by adding more memory (the 32-bit versions don’t really use memory beyond 4 GB).

One caveat is that the ODBC data sources used by Accpac are 32-bit ones, stored in the 32-bit part of the registry. To see them you have to use the 32-bit version of odbcad32.exe, which you need to run from the strange location C:\Windows\SysWOW64\odbcad32.exe.

Network Bandwidth

If you are running in a standard client/server workstation configuration, each workstation communicates with SQL Server over the network. For best performance you want the best network connectivity you can get, but generally a 100 Megabit connection is fine.

If you are running with Citrix or Terminal Server and SQL Server is also on the Terminal/Citrix server, then it can use shared memory access, which provides good performance. However, this is tempered by how much memory and processor resource is required by the Accpac business logic, UIs and SQL Server. Generally, once you get past a few users it works better to move SQL Server to a separate machine with a high-speed connection to the Citrix/Terminal server, ideally with both on a gigabit backbone. As we move to the Web, a web server will replace the Citrix/Terminal server and the same advice will apply to the web server.

Another recommendation I’ve heard is to ensure you only use switches and not hubs, and, if you are using VoIP, to make sure you aren’t saturating your network bandwidth with phone traffic.

Disk Storage

I know a lot of people who swear by various DAS (Direct Attached Storage), NAS (Network Attached Storage) and SAN (Storage Area Network) solutions, but my own experience with these has been a bit negative. I find them very useful for storing large numbers of files, like virtual images, photos and such, but unreliable and often plagued by performance problems when used to store databases directly accessed by a database server. There are also now cloud services that let you have your database in the cloud and your application server on premise, but I tend to be skeptical of the performance and uptime of such services. However, these all make great backup devices.

My own preference is to use RAID storage on the server. RAID 0 is the fastest but has no redundancy (so I never recommend it). The other RAID levels offer various degrees of redundancy, so if a drive fails it can be replaced without losing any data; have a look at http://en.wikipedia.org/wiki/RAID. The decision of which to go with depends a bit on how much redundancy you want versus how much you are willing to pay. RAID 5 is very redundant, but tends to suffer a bit in performance.

A recommendation we often give is to locate your LDF (log) files on RAID 1 storage and your MDF and other data files on RAID 1+0 storage.

Even with the reliability of RAID, make sure you still back up your company databases, whether by mirroring to offsite servers, backing up to rotating external hard drives, DBDumping, etc. It’s also a good idea to test that you can restore your backups. Remember that the value of your company database far exceeds the value of the equipment it is stored on.
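For completeness, here is a hedged sketch of scripting a simple full backup with plain T-SQL from Python. The database name and backup path are examples only, and this is no substitute for whatever backup strategy your DBA or backup software already provides.

import pyodbc

# Hedged sketch: take a full backup of a company database with T-SQL.
# BACKUP DATABASE cannot run inside a transaction, so turn autocommit on.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes",
    autocommit=True,
)
cursor = conn.cursor()
cursor.execute("BACKUP DATABASE SAMINC TO DISK = N'C:\\Backups\\SAMINC.bak' WITH INIT")
# Consume the informational messages so the backup runs to completion.
while cursor.nextset():
    pass
conn.close()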

Database Mirroring

SQL Server supports a number of database mirroring options, meaning that you can have your SQL Server stream database changes to a mirror SQL Server. Besides acting as a hot standby in case of failure, the mirror can be used as a reporting server. Reporting through Crystal can put a fairly big strain on SQL Server, and one way to offload this is to configure users who only need read-only reporting access to connect to the mirror server. This way a department that runs many reports won’t impact the main data entry that is going on.

Summary

These are just a few tips and tricks for dealing with Accpac and SQL Server. Generally, as your database becomes larger and the number of users increases, so does the amount of care you need to take with your database and network infrastructure; for smaller installations you generally just need to ensure you take regular backups. Of course, if you don’t want to deal with database administration at all, you might want to consider using a cloud service like AccpacOnline.

Written by smist08

August 13, 2011 at 5:44 pm

Posted in sage 300


Fun with SData


Introduction

SData is Sage’s new REST-based Web Services API used by many of our applications. I’ve blogged about SData a few times already: SData in Sage ERP Accpac 6, More on SData and Sage ERP Accpac 6 and Stateful SData. Calling something a RESTful Web Services API sounds pretty technical and difficult; however, this blog post will try to show that using SData isn’t all that hard and that perhaps it provides an easier-to-use access method than SQL or other APIs.

You might think that you need a complicated SDK with a giant reference manual to use SData. In fact, you don’t need an SDK at all. Since SData is based on web standards, many tools already know how to deal with it. In previous blog posts I showed how SData queries are just URLs that you can enter in a browser like Chrome and then see the XML response in the browser window. Not very friendly. In this article we’ll show how you can use standard open source tools to play with SData, and then how even commercial tools like Microsoft Excel can deal with SData since Excel knows how to interpret web standards. All the documentation for SData is also freely available at http://sdata.sage.com/.

cURL

cURL is an open source command-line utility and library. You can use the command-line utility to execute SData requests to retrieve data and to insert/update/delete records. cURL is a handy utility for doing anything you like with things that use the HTTP protocol, including requests to web sites and web services.

For instance you can issue the command:

curl --user "ADMIN:ADMIN" http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers

which will return all the customer records from the Accpac SAMINC database. The --user parameter is used to set the login; note that the credentials must be in upper case for Accpac. Replace localhost with your own server name and SAMINC with the company id that you want to use.

You can read a specific record by specifying the index as in the following example that reads good old Ronald Black:

curl --user "ADMIN:ADMIN" "http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers('1200')"

Try playing with the query language specified at http://interop.sage.com/daisy/sdata/Queries/Filtering.html to retrieve just the records you want. For instance:

curl --user "ADMIN:ADMIN" "http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers?where=IDCUST gt '1500'"

to get customers with codes greater than 1500.

You can also insert records with cURL, for instance issue the command:

curl --user "ADMIN:ADMIN" http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers -X POST -T cust.xml

where cust.xml contains:

<?xml version="1.0" ?>
<entry xmlns:sdata="http://schemas.sage.com/sdata/2008/1"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://www.w3.org/2005/Atom">
<id>Customer</id>
<title>Some title</title>
<content>Content</content>
<sdata:payload>
<customer>
<IDCUST>STEVE</IDCUST>
<NAMECUST>Stephen Smith</NAMECUST>
<IDGRP>RTL</IDGRP>
<IDACCTSET>USA</IDACCTSET>
<CODETAXGRP>CALIF</CODETAXGRP>
<IDBILLCYCL>BILLWK</IDBILLCYCL>
<TEXTSTRE1>865 W. 14th Ave.</TEXTSTRE1>
<NAMECITY>Vancouver</NAMECITY>
<CODESTTE>B.C.</CODESTTE>
<CODEPSTL>V5Z 1R2</CODEPSTL>
<CODECTRY>Canada</CODECTRY>
<NAMECTAC>Stephen Smith</NAMECTAC>
<TEXTPHON1>(604) 555-1234</TEXTPHON1>
</customer>
</sdata:payload>
</entry>

Note: if you copy these examples and have problems, check the double quotes; Word and blogging software tend to change regular double quotes to slanted ones, which don’t work in XML or on the command line. Also, the option before user is meant to be two dashes, not a long dash.

SData from Excel

Excel has a very good ability to query web sites and then parse the data returned. Excel knows about most web standards, including the Atom feeds used by SData. You do this by creating a data connection from the web and specifying the SData URL as indicated in the dialog below:

It will prompt you for a user id and password; enter your Accpac credentials (all in upper case). There are a lot of extra fields and URLs returned in the worksheet that is produced, but in the middle is all the data that you want from Accpac.

Now you can create a more user-friendly worksheet based on this one. You can refresh the data anytime you like, and you can automate the collection with a VBA macro. In a new worksheet you can create charts linked to this data, do calculations on it, or manipulate it any other way you like using Excel.
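If you’d rather script this outside of Excel, the same feed can be pulled with a few lines of Python. This is a hedged sketch using the third-party requests library; the URL, company id and field names follow the cURL examples above, and the namespace handling is deliberately loose so it works whether or not the payload elements are namespaced.

import requests
import xml.etree.ElementTree as ET

# Hedged sketch: fetch the Customers feed and print a couple of fields from
# each Atom entry's SData payload. Adjust server, company and credentials.
URL = "http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers"
NS = {
    "atom": "http://www.w3.org/2005/Atom",
    "sdata": "http://schemas.sage.com/sdata/2008/1",
}

resp = requests.get(URL, auth=("ADMIN", "ADMIN"))  # upper case for Accpac
resp.raise_for_status()

root = ET.fromstring(resp.content)
for entry in root.findall("atom:entry", NS):
    payload = entry.find("sdata:payload", NS)
    if payload is None:
        continue
    for record in payload:  # one customer element per payload
        # Index fields by local name so namespaces don't matter.
        fields = {child.tag.split("}")[-1]: child.text for child in record}
        print(fields.get("IDCUST"), fields.get("NAMECUST"))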

Finding SData URLs

So how do you know what SData feeds are available from an application? You can run Fiddler to spy on the HTTP traffic to see what requests are being made (similar to other Accpac spy programs).

But SData is actually fairly self-documenting. For instance, if you want to get the list of all SData resources for a given dataset, enter:

curl --user "ADMIN:ADMIN" http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC

We call this a shortened URL and it returns a feed of all the SData resources that you can use at the next level of the URL. In the same way if you enter:

curl --user "ADMIN:ADMIN" http://localhost/SDataServlet/sdata/sageERP/accpac

it will return a feed that contains a list of all the Accpac datasets (companies) on that server.

If you want to get information on a feed you can use a $schema call, so if you issue:

curl --user "ADMIN:ADMIN" http://localhost/SDataServlet/sdata/sageERP/accpac/SAMINC/Customers/$schema

you will get back a description of all the fields in the feed.

Summary

Many people view SData as a developer technology for ISVs to integrate with Sage products. But I think that for partners and integrators who are comfortable performing customizations and integrations using SQL, SData provides another rich approach to consider. Especially for newer graduates entering careers installing, customizing and maintaining ERP systems, URLs and REST-based web services will be a much more natural way to go.

Written by smist08

August 6, 2011 at 5:47 pm

Posted in sage 300
