Stephen Smith's Blog

Musings on Machine Learning…

Archive for November 2013

Using the Sage 300 .Net API from ASP.Net MVC

with 8 comments


In this blog posting we are going to switch from using WinForms projects to an ASP.Net MVC project. ASP.Net MVC is Microsoft’s newest Web Development Platform. Be careful not to confuse ASP.Net MVC with ASP.Net. Both are still available and have similar version numbers, both are still part of Visual Studio, but they are quite different. The main problem with ASP.Net is that it doesn’t handle HTML5 web pages with lots of JavaScript well. Generally if you write a lot of JavaScript, the ASP.Net framework becomes pretty much useless. Microsoft then answered these complaints with ASP.Net MVC which is a more modern framework designed to build modern interactive web sites.


I’ve now installed Visual Studio 2013, so all the projects going forwards are going to be stored in that format. As a result I created the sample application that goes with this article in Visual Studio 2013 using ASP.Net MVC version 5.

MVC versus WinForms

I’ve been using WinForms for the samples in this series of articles on the Sage 300 .Net API because WinForms makes it really easy to write simple programs. I just create a simple project, throw some controls on the form, double click on a control and write some code to do something. Very simple and easy. But the end result is just a Windows application that needs to be installed on a given computer along with the .Net framework. ASP.Net MVC is for writing Web applications and it is much more complicated to get started. But the end result is a scalable web application that can run on a farm of web servers, with the UI running in any browser on the client’s computer with nothing else installed except the browser. I can produce UIs just as rich as with WinForms, but now they need to be specified in HTML5/CSS, and they gain a number of additional capabilities as a result.

With WinForms I can create global variables (or leave objects instantiated for the duration of the program), keep state easily and pass data around freely. With ASP.Net MVC each operation causes new objects to be created and nothing is remembered from call to call (unless I do something special). This way a farm of application servers can handle the requests, each working independently. With WinForms there was no particular structure to the programs I produced; in ASP.Net MVC we are dealing with a very definite MVC structure.


Model – View – Controller (MVC) is a design pattern for creating more robust and more maintainable user interfaces programs. It separates the concerns into three parts to produce a more modular design.


I’m not going to go too much into the details of creating a program and setting these up, since there are dozens of really good tutorials on the Internet like this one. Rather let’s look at our example and talk about the various elements in a real setting.

Sample Program

For a sample program I ported the first WinForms application to ASP.Net MVC. This one is the ARInvEntry project. It basically puts up a web form where you enter the customer number and then hit the “Create Invoice” button and it creates an A/R Invoice for that customer. The main goal here is to get a starting project with a Razor View, a controller and then a Model that calls into the Sage 300 Business Logic using our .Net API.

Note that since Sage 300 ERP is a 32 Bit application, you must run this project as a 32 Bit application, which means if you run this through IIS, you must have the application pool used set to have 32 Bit mode enabled (probably the default application pool).

In this program the main web page is a razor view and it is connected to the model which allows the framework to provide data binding of the customer number into the model, so we don’t need to write any code to move the data from the browser into the model’s member variable. Inside Index.cshtml there is the statement:

@model ARInvEntry.Models.CreateInvoice


which connects the View to the Model. The form is then defined by:

        @using (Html.BeginForm("CreateInvoice", "Home")) {
            <p>Customer Number:</p>
            <p>@Html.TextBoxFor(model => model.CustomerNumber)</p>
            <input type="submit" value="Create Invoice" />
        }

Notice that this code is a combination of HTML, like <p>, and template helper functions, like @Html.TextBoxFor (whose argument binds it to the model’s member variable). The Html.BeginForm helper connects the “Create Invoice” button to the controller’s action method. “Home” is the name of the controller (which was generated by Visual Studio) and its code is in HomeController.cs. The first argument, “CreateInvoice”, is the method to call when the form is submitted (i.e. the button is pressed). All the @ template functions are converted to pure HTML by the Razor view engine when the page needs to be generated.

It may not look like there is a lot of glue holding things together here; that is because ASP.Net MVC does a lot of things by convention, so if you don’t specify something explicitly, it has a convention to find it. Generally, if things have the same (or similar) names then they go together. This makes coding faster since you just follow some naming conventions and all the attendant glue is provided automatically, so you don’t need to worry about it. This convention over configuration approach takes a bit of getting used to, but once you are used to it, it makes development go quite a bit quicker.

In the controller (HomeController.cs) there is the action method which gets called when the user hits submit:

        public ActionResult CreateInvoice(Models.CreateInvoice crInvObj)
        {
            crInvObj.DoCreateInvoice();
            return RedirectToAction("Index");
        }

Basically this is called with the model as a parameter where the data bound members will already be filled in. So all it has to do is call the method in the model to create the invoice.

It then returns redirecting us back to the main page.

The model has a public member CustomerNumber:

        public string CustomerNumber { get; set; }


which is used by the data binding support and populated automatically from the browser. Then there is the DoCreateInvoice method that contains the code from the previous project to create the invoice.
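To make this concrete, below is a minimal sketch of what such a model class could look like. Only CustomerNumber and DoCreateInvoice are names from the actual project; the session setup values and DBLink code are placeholders carried over from the earlier WinForms samples in this series, so treat them as illustrative.

```csharp
using System;
using ACCPAC.Advantage;

namespace ARInvEntry.Models
{
    public class CreateInvoice
    {
        // Populated automatically by MVC model binding from the Razor form.
        public string CustomerNumber { get; set; }

        public void DoCreateInvoice()
        {
            // Session setup as in the earlier WinForms articles; the
            // version, user, password and company are sample-data placeholders.
            Session session = new Session();
            session.Init("", "XY", "XY1000", "62A");
            session.Open("ADMIN", "ADMIN", "SAMLTD", DateTime.Today, 0);
            DBLink dbLink = session.OpenDBLink(DBLinkType.Company, DBLinkFlags.ReadWrite);

            // ... open and compose the A/R invoice batch/header/detail views
            // and create an invoice for CustomerNumber, as in the first
            // WinForms sample ...
        }
    }
}
```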

Limitations and Future Work

At this point, this is a very basic MVC project which just really establishes our framework for future work. There is no error reporting (except for writing to the VS output window). After you hit submit, we do the work and then return a complete page. We aren’t doing the searching we supported in the WinForms application.

For the error reporting we can’t just pop up a message box from the Model when an error occurs. Generally all UI communication needs to flow from the model back to a View. We could add JavaScript to the View to get the response from the model and display a message box (JavaScript alert). We could add a new Razor view for displaying error and status messages. We could provide a message area on the main form to fill in. We’ll look at implementing some of these in future articles.

Right now when the invoice is created, we simply refresh the home screen. This isn’t particularly efficient or elegant. Partly this is because of how the submit button is implemented as a form submit. What we would really like to do is to submit an Ajax web service request when the button is pressed and then just update select parts of the screen based on the reply.

If you choose to deploy this project to Azure as a PaaS application, you will run into problems due to the dependency on the Sage 300 .Net API, which requires at least workstation setup to be installed, so we will need to talk about how to deploy a project of this nature to the cloud.


This article presents a very simple ASP.Net MVC application whose Model communicates with Sage 300 via our .Net API. Now that we have this starting point, we can start developing new elements of the project and explore various aspects of writing programs with ASP.Net MVC.

Written by smist08

November 23, 2013 at 9:40 pm

The Sage Visions and Sage Connect 2013 Conferences

with 5 comments


I’ve just returned from attending first the Sage Visions 2013 conference in Ho Chi Minh City, Vietnam and then a couple of weeks later the Sage Connect 2013 conference in Sydney Australia. In this blog posting I thought I’d provide a few highlights from these conferences and a few points of interest from my trips.

Ho Chi Minh City

Ho Chi Minh City is a city of 9 million people and 5.5 million motorbikes. There aren’t many traffic lights and traffic tends to be a continuous stream of motorcycles. Crossing the street is an adventure, since traffic never stops and there are never any breaks in the traffic. You have to just start crossing the street walking slowly and steadily, then all the motorbikes just flow around you. This is a bit unnerving at first, but by the time you are ready to leave, you start to get used to it.

It’s interesting to visit all the historical sites that were famous from the Vietnam War, like the Presidential Palace, the CIA headquarters (now an apartment building) and the Cu Chi Tunnel Complex. Generally Ho Chi Minh City is quite inexpensive to visit and has some interesting attractions. It is a very vibrant and bustling city that feels like it is moving in the right direction. They are building a subway system that is half finished and are starting to build a bullet train from Ho Chi Minh City to Hanoi.

Below is a picture of yours truly with Ho Chi Minh in front of city hall.



Sydney

I’ve visited Sydney a few times now, so I find it a relaxing place to visit since I know my way around and I’ve already seen most of the tourist attractions. The Connect conference was held in the InterContinental Hotel, which is about a block from Circular Quay and the Sydney Opera House. Sydney has some great restaurants and things to do. I like jogging along the seawall past the Royal Botanic Gardens and swimming in the Andrew Charlton Pool.

Below is a picture I took out of a helicopter on a tour of the harbor.



In both conferences we presented the current roadmap for Sage 300 ERP. As always with forward looking slides, they are subject to change. The dates in this version are for North America. The release of the various cloud products varies a bit from region to region. For instance in Australia they have had Sage Inventory Advisor for over a year and are just introducing Sage Payment Processing.


The roadmap shows the four main streams that we are working on, namely the Sage 300 ERP core product, the forthcoming Sage 300 Online 1.0, all the connected services and then the Sage 300 Online 2.0.

As we move to operate more as a cloud company, we will be introducing features into the market more quickly. For Sage 300 Online we will deploy new features to the cloud as they are ready, then bundle them up as product updates to release to on-premise customers. The arrows for the product updates are just meant to indicate that they will be frequent, rather than being commitments to specific dates.

Lots of Feedback

As always, it’s great to get lots of feedback on the product from all the partners and customers. I always bring it all back to the R&D and Product Management teams in Vancouver. But remember, everyone can provide feedback at any time using our feedback website. At this site you can see all the suggestions we implemented and those we plan to implement. You can also vote on the suggestions and see which suggestions are getting the most votes.

Both R&D and Product Management watch this list and choose items to implement from the top of the list. So the best way to get changes implemented is to enter them here and get your friends to vote for them.

The Cloud

It was fun demoing our various cloud products including Sage 300 Online, Sage Mobile Sales, Sage Billing and Payments and Sage Mobile Service. I only managed to demo one of these at each conference, but they were all well received. The main question was when they would be available in these regions.

There was a lot of interest in Sage 300 Online since a lot of customers are looking for ways to avoid managing their own servers, especially in regions that can be hit by typhoons and other natural disasters. Having someone else install new versions and update data for you is also a big draw. Fortunately there are Microsoft Azure data centers in Singapore, Sydney, Hong Kong and Melbourne to serve these markets.

Economic Outlooks

At the Sage Visions conference the top awards were taken by Thai business partners, as growth in Thailand has been on an upswing after a period of suffering from the global economic slump, political problems and a major flood. The other South East Asian countries are a bit of a mixed bag, with some doing OK and some still suffering.

In Australia, they are suffering a bit from a high Australian dollar, but the housing and construction industries seem to be well into a recovery. Partners appear to have had a good year and are hopeful that the global economic downturn is behind them. Their growth is currently running at 2.6% and interest rates seem lower than previously. Their big worry is what is happening in China and what demand will be for their natural resources.


Attending international Sage conferences is always fun. I enjoy meeting up with all the partners in a region, meeting with a number of customers, providing new information on Sage 300 and getting feedback on the various concerns in these regions. Plus there is the great opportunity to play corporate tourist and see some parts of the world that I wouldn’t otherwise get a chance to visit (otherwise known as playing in the Amazing Race, Corporate Edition).

Written by smist08

November 16, 2013 at 8:58 pm

Using Browse Filters in the Sage 300 ERP .Net API

with 3 comments


So far in exploring the Sage 300 ERP .Net API, we’ve either read specific records via their key segments or browsed through all the records in a table. More generally, though, applications are interested in processing subsets of the entire table in an efficient manner, for instance all invoices for a given customer or all G/L Accounts for a specific region. In this article we are going to look at some of the mechanisms we provide to do this.

Generally you browse through a set of records by calling the View’s Browse method to set a filter on the records and then you call the various methods like Fetch, GoTop, GoNext, etc. to iterate through the records. There are also the FilterSelect and FilterFetch routines to iterate through a selection of records. First we’ll talk about the actual filters, then we’ll return to the methods in the API that use them.

Browse Filters

Filters are strings that are passed as parameters to various methods in the API to restrict Views according to various criteria. This would be used when a user enters selection criteria to locate data.

The format of the string expression is:

expression ::= [(…] condition [)…] [Boolean-operator [(…] condition [)…]…]


  • condition ::= field-name relational-operator operand
  • Boolean-operator ::= AND | OR
  • operand ::= <field-name | constant>
  • relational-operator ::= > | < | = | <= | >= | != | LIKE

An example of an expression is:

LASTNAME = "SMITH" AND AGE < 30 OR AGE > 40

where LASTNAME and AGE are field names within the View.

Brackets are allowed. Expressions are evaluated strictly from left to right unless brackets are put in. Thus, in the previous example, the expression is evaluated as:

(LASTNAME = "SMITH" AND AGE < 30) OR AGE > 40

This is true for SMITHs under 30 years of age, and for any person over the age of 40. Put in brackets if you want the other order. For example:

LASTNAME = "SMITH" AND (AGE < 30 OR AGE > 40)

returns only SMITHs under 30 years of age or over 40.

All the relational operators work with all the field types in the expected way, except for Boolean fields, where only the = and != operators apply.

Note that both operands can be fields. If they are both fields, they must be the same type. When the expression is parsed, if the second operand is a constant, it is converted to the type of the first operand (which is always a field).

The filter mechanism handles all the field types, but since filters are strings, you have to know how to represent the various values as strings. Boolean fields are always TRUE or FALSE. Date fields are always formatted as YYYYMMDD. Time fields are formatted as HHMMSSHH (the last HH is hundredths of a second).
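For example, to filter on a date you have to render the date into the YYYYMMDD form before concatenating it into the filter string. A small sketch (the DATEINVC field name is just illustrative):

```csharp
// Format a DateTime into the YYYYMMDD string form the filter parser expects.
string cutoff = new System.DateTime(2013, 11, 1).ToString("yyyyMMdd");
string filter = "DATEINVC >= " + cutoff;    // "DATEINVC >= 20131101"
```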

The use of white space around operators, field names and constants is necessary. If a constant is a string and contains white space then it must be enclosed in quotes. The quote character (“) can be embedded by using \” (no other replacements are performed).

The LIKE operator is like the = operator, except the second operand can contain the wild cards % and _, where:

  • % matches any group of characters.
  • _ matches any single character.

For example, LASTNAME LIKE “SM%” is true for any LASTNAME that begins with SM. LIKE can only be used with string field types.


The internal mechanisms of our API optimize the processing of filter strings: they are either converted to SQL WHERE clauses when the requests are processed via a SQL database, or the best key segment is chosen and traversed when record-based processing is needed. Note that you should still pay attention to indexes when using SQL; SQL databases aren’t magical and do require good indexes when filtering large tables, or the query could take quite a long time.

Using Filters

Sage 300 version 1.0A only ran on Btrieve as a database, and the only methods that used filters were Browse and Fetch. You can still use these today and they are very useful methods. As the API evolved and we added more and more SQL support, we added to these basic methods to help with performance and to make the API easier to use.

The Fetch method behaves very much like the Btrieve API. The consequence of this is that it will sometimes do a get-greater-than-or-equal and sometimes do a get-next. It will do the get GE (or get LE when going backwards) after a Browse call or after one of the key segments has its value set. For UI programmers this was annoying, since they are often changing direction frequently and had to handle the case where Fetch returns the current record.

To simplify this we added GoTop, GoNext, GoPrev and GoBottom. GoTop will get the first record, GoBottom the last, and GoNext will always get the next record, never the current record; similarly for GoPrev. Generally I find these newer methods far easier to use, and tend not to use Fetch anymore. Note that these Go methods are part of the COM and .Net APIs and are not part of the regular View API, so they may not be available in other API sets; internally the .Net API translates them into the correct Fetch calls to do the job.

To aid in SQL performance we added FilterSelect which will actually issue a SQL query to retrieve a results set with the indicated records. Then FilterFetch browses through these records. For Pervasive.SQL these methods work under the covers just like Browse/Fetch. But note that you can’t update records that you retrieve this way. If you want to update such a record you need to call Read first. This is due to multi-user issues as well as the semantics of how SQL result sets work.

There is also FilterCount, which returns a count of the records that match the given filter, and FilterDelete, which deletes the records that match the filter. Beware that these two functions translate into SQL statements for SQL databases and will execute as fast as the database server can process them (which might not be fast if there aren’t indexes to exploit), whereas Pervasive will iterate through the records to perform the operation. Typically you call FilterDelete only on header or flat type Views, and leave the header to call FilterDelete on the details for you.
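As a rough sketch of how these might be called (I’m assuming parameter lists similar to FilterSelect here, so check the API reference for the exact signatures; the field and value are also illustrative):

```csharp
string filter = "IDGRP = \"WHOLE\"";              // illustrative filter
long numMatching = arCus.FilterCount(filter, 0);  // count matching customers
// arCus.FilterDelete(filter, 0);                 // would delete them all
```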

Sample Program

In the sample program we add a search box where you can type a string, and then the program fills the list box with any records where either the customer number or customer name contains the string you typed. The sample does the search first using Browse/GoTop/GoNext and then does the same loop using FilterSelect/FilterFetch. Note that this search isn’t scalable, since the database will need to scan the entire customer table looking for these substrings. But often SQL Server can do these sorts of searches reasonably quickly as long as the data isn’t too big. Note that adding an index won’t help, because we are looking for substrings. The more scalable way to do this would be via full text indexing or via a Google-like search tool.

bool gotOne;
// String constants containing wildcards are quoted per the filter syntax above.
string searchFilter =
     "IDCUST LIKE \"%" + SearchBox.Text + "%\" OR NAMECUST LIKE \"%"
     + SearchBox.Text + "%\"";

// Fill up the list box using Browse/GoTop/GoNext
arCus.Browse( searchFilter, true );

gotOne = arCus.GoTop();
while (gotOne)
{
    // (the list box control name here is illustrative)
    custList.Items.Add(
         arCus.Fields.FieldByName("IDCUST").Value + " " +
         arCus.Fields.FieldByName("NAMECUST").Value );
    gotOne = arCus.GoNext();
}

// Go through the records using FilterSelect/FilterFetch but
// just print them to the console.
arCus.FilterSelect(searchFilter, true, 0, ViewFilterOrigin.FromStart);
while (arCus.FilterFetch(false))
{
     Console.WriteLine(arCus.Fields.FieldByName("IDCUST").Value +
          " " + arCus.Fields.FieldByName("NAMECUST").Value);
}


This was a quick introduction to Browse Filters in the Sage 300 ERP API. These form the basis for selecting sets of records and we will be using these extensively in future articles.


Written by smist08

November 9, 2013 at 1:10 am

Using the Sage 300 ERP View Protocols with .Net

with 32 comments


Last week we looked at opening and composing groups of Views that will work together to accomplish a task. Now we will start to look at how to use the View API via .Net to accomplish common tasks. For Views that work alone this is fairly straightforward and is just a matter of knowing which methods to call and in which order. Using multiple Views together is a bit more complicated, but it follows a standard pattern, and once you get the hang of the pattern it is fairly straightforward as well.

This article is just looking at the general algorithms rather than the details. We do provide one concrete example in the sample code. In future articles we will be using these protocols extensively so you will see more concrete examples. Some terms like read and browse/fetch are a bit vague and generally in the API there are several ways to do these which we will look at in future articles. Also some of the mentioned methods have parameters that we will look into in later articles as well.

You can skip any step marked as optional, and sometimes you can skip other steps in simpler situations; it isn’t until you get to more complicated situations that the need for these steps becomes evident. Part of using these algorithms or protocols is to ensure good multi-user throughput, so although another order of steps might work, it might lead to poor performance. These tend to be a combination of best practices along with some absolutely required parts.

If you’ve worked with Sage 300 ERP for any length of time, you might know we have an Init method which used to do the work of RecordCreate and RecordClear. We’ve deprecated this method, and I’ve tried to use all the new up-to-date ways of doing things in this article. But if you do see an error in any of these articles, just leave a comment and I’ll fix it.

View Classes

Basically there are six classes of Views where all the Views in the same class use the same protocol for their basic operations. Here we will go through each class quickly with quick algorithms of how to accomplish various standard CRUD operations.

The classes of views are:

  • Flat
  • Ordered header/detail
  • Sequenced header/detail
  • Batch/header/detail
  • Processing
  • Detail

Flat Views

Flat Views are Views that do not need to be composed with any other Views. The key for a flat view may have multiple segments. Most setup Views are flat. Most master Views can be used as flat Views if you aren’t changing the details. Generally you will use these protocols quite a lot.

Insertion protocol

  1. RecordClear the view to initialize the record.
  2. Set values to the fields, including the key fields.
  3. Insert the record.
  4. If there are more records to insert, go to step 1.
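In the .Net API, the steps above might look like the following sketch. The View ID, field names and values are illustrative, and dbLink is assumed to come from an open session as in the earlier articles.

```csharp
// Flat-View insertion: RecordClear, set the fields (including keys), Insert.
ACCPAC.Advantage.View cusGrp = dbLink.OpenView("AR0024");    // view ID illustrative

cusGrp.RecordClear();                                        // step 1: initialize
cusGrp.Fields.FieldByName("IDGRP").SetValue("WHOLE", false); // step 2: key field
cusGrp.Fields.FieldByName("TEXTDESC").SetValue("Wholesale customers", false);
cusGrp.Insert();                                             // step 3: insert
```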

Deletion protocol

  1. Set key values to the key fields.
  2. Read or Browse/Fetch the record.
  3. Delete the record.
  4. If there are more records to delete, go to step 1 to Read another record, or go to step 2 to Fetch the next record to delete. (Optional.)

Update protocol

  1. Set key values to the key fields.
  2. Read or Browse/Fetch the record.
  3. Set values to the fields to be changed.
  4. Update the record.
  5. If there are more records to update, go to step 1 or 2 depending on whether Read or Browse/Fetch is used. (Optional.)
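The update protocol looks similar. Here is a hedged sketch with the same placeholder names (I’m assuming Read takes a lock flag, as in the other samples in this series):

```csharp
// Flat-View update: set the key, Read, change fields, Update.
cusGrp.Fields.FieldByName("IDGRP").SetValue("WHOLE", false);            // step 1
if (cusGrp.Read(false))                                                 // step 2
{
    cusGrp.Fields.FieldByName("TEXTDESC").SetValue("Wholesale", false); // step 3
    cusGrp.Update();                                                    // step 4
}
```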

Ordered Header/Detail Views

This is the protocol for two Views composed together where one is the header and the other is the detail. The details are kept in key order (as opposed to position order). You won’t see that many Views of this type; typically these are the header/detail Views used in setup forms.

The detail View’s primary key always starts with the primary key of the header View and then adds its own key segments. This then establishes which details belong to the header. Whenever a key field in the header is set, it is automatically also set in the detail View. When you Browse/Fetch through the detail records, you only get the details for the current header View.

Insertion protocol

  1. RecordClear header to initialize the fields.
  2. Set the fields in the header.
  3. RecordClear detail to initialize the fields.
  4. Set the fields in the detail.
  5. Insert detail.
  6. Go to step 3 if there are more details.
  7. Insert header. (This will Post the details)
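The insertion protocol above might be sketched as follows; headerView and detailView are assumed to be opened and composed as in last week’s article, and the field names are placeholders.

```csharp
// Ordered header/detail insertion: details are inserted first, then the
// header Insert posts them.
headerView.RecordClear();                                            // step 1
headerView.Fields.FieldByName("HDRKEY").SetValue("KEY1", false);     // step 2

detailView.RecordClear();                                            // step 3
detailView.Fields.FieldByName("SOMEFIELD").SetValue("value", false); // step 4
detailView.Insert();                                                 // step 5
// ... repeat steps 3-5 for each detail ...

headerView.Insert();                                                 // step 7: Posts details
```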

Update protocol (Include deleting details)

  1. Set header key into header view.
  2. Read or Browse/Fetch the header view to get to the header.
  3. Set the fields to be updated in the header view. (Optional.)
  4. Set detail key into the detail view.
  5. Read or Browse/Fetch the detail view to get to the detail.

The first n segments (where n is the number of segments in the header key) of the detail key will have been Put by the header because of the composition.

  6. Set the fields to be updated in the detail view. (Optional.)
  7. Update or Delete the detail.
  8. Go to step 4 to update another detail.
  9. Update header. (This will Post the details.)

Protocol for deleting header

  1. Set the header key in the header view.
  2. Read or Browse/Fetch the header. This causes the audit stamp to be read.
  3. Delete the header. (This will cause all details in the header to be deleted.)

Protocol for browsing

  1. Set the header key in the header view.
  2. Read or Browse/Fetch the header.
  3. Get fields from the header view.
  4. Browse/Fetch the detail view. (Browse/Fetch will not go beyond the header key.)

Sequenced Header/Detail Views

This class of Views is the most common for document entry in the operations modules. The detail View has all the key segments from the header View and then adds one numeric type field to act as the sequence number. The last segment of the header key is also numeric and contains the header number. Note that these numbers can be stored in string type fields, in which case they are masked to only contain the decimal digits ‘0’ to ‘9’. Often you can either set the header key or you can get the header to generate a key which means it allocates the next header number. The header key segments are always automatically set in the detail view and when you browse through the detail records you only get the records for the current header.

Insertion protocol

  1. If the next header number is generated by the view, use RecordCreate to generate the next available header number. If the header number is specified by the caller, use RecordClear to initialize the fields of the header, then set the header number in the header view.
  2. Set the fields in the header.
  3. RecordClear detail to initialize the fields.
  4. Set zero into the detail number field to insert from the start.
  5. Set values in the other detail fields.
  6. Insert detail.
  7. Go to step 5 until no more detail.
  8. Insert the header. (This will do a Post of the details.)

Note that an insert of the details does an “insert after”. So because we don’t reset the sequence number in the detail, each insert will be after the previous one. If we want to insert after a specific record, then put that sequence number into the key field. A common bug is to set the key field to 0 each time, which causes the records to be inserted in reverse order, since each new one is inserted at the beginning.
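Putting the insertion protocol together with this, a hedged sketch (placeholder view and field names; I’m assuming RecordCreate with the ViewRecordCreate.DelayKey option generates the header number without inserting the record):

```csharp
// Sequenced header/detail insertion with a view-generated header number.
headerView.RecordCreate(ViewRecordCreate.DelayKey);                // step 1
headerView.Fields.FieldByName("SOMEFIELD").SetValue("...", false); // step 2

detailView.RecordClear();                                          // step 3
detailView.Fields.FieldByName("CNTLINE").SetValue(0, false);       // step 4: from start
detailView.Fields.FieldByName("OTHERFIELD").SetValue("...", false);// step 5
detailView.Insert();                                               // step 6
// ... repeat steps 5-6; note we do NOT reset CNTLINE back to 0 ...

headerView.Insert();                                               // step 8: Posts details
```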

Protocol for update (Include deleting details)

  1. Set header key into header view.
  2. Read or Browse/Fetch the header view to get to the header.
  3. Set the fields to be updated in the header view. (Optional.)
  4. Set detail key into the detail view.
  5. Read or Browse/Fetch the detail view to get to the detail. The header number in the detail view will have been set by the header.
  6. Set the fields to be updated in the detail view. (Optional.)
  7. Update or delete the detail.
  8. Go to step 4 to process another detail.
  9. Update the header. (This will do a Post of the details.)

Protocol for deleting header

  1. Set the header key in the header view.
  2. Read or Browse/Fetch the header. This causes the audit stamp to be read.
  3. Delete the header. (This causes all details in the header to be deleted.)

Protocol for browsing

  1. Set the header key in the header view.
  2. Read or Browse/Fetch the header.
  3. Get fields from the header view.
  4. Browse/Fetch the detail view. (This will not go beyond the header key.)

Batch/Header/Detail Views

This class of Views contains a batch View, header View and detail View. These are used for document entry in the Financial Modules. The header/detail part is just the sequenced header/detail class described above and works pretty much the same way. In a similar manner the batch View has some key segments, one of which is the batch number, which is a numeric type (or a string where only ‘0’ through ‘9’ are allowed). Then the header shares the batch number and adds its header number, and the detail shares the header’s keys, adding its detail sequence number. The batch number is usually generated and can rarely be specified for a new batch. Again the key fields from a higher level View are automatically set in the lower level Views, so when you browse headers you only see the headers for the current batch, and when you browse details you only see the details for the current header.

Insertion protocol

  1. RecordCreate batch to get the next available key for the batch.
  2. Set the fields in the batch.
  3. Update batch. (The batch record is already inserted by the RecordCreate call, in step 1.)
  4. RecordCreate header to get the next available key for the header. The key for the batch in the header view is assumed to be set already.
  5. Set the fields in the header.
  6. RecordCreate detail to initialize the fields.
  7. Set zero (or last detail number) into the detail key field to insert from the start.
  8. Set values in the other detail fields.
  9. Insert detail.
  10. Go to step 6 until no more detail.
  11. Insert header. (This will do a Post of the details.)
  12. Go back to step 4 to add another header.

Protocol for update (Include deleting details)

  1. Set batch key into the batch view.
  2. Read the batch record.
  3. Set the fields to be updated in the batch view. (Optional.)
  4. Update batch view. (Optional.)
  5. Set header key into header view.
  6. Read or Browse/Fetch the header view to get to the header. The key for the batch in the header view is assumed to be set already.
  7. Set the fields to be updated in the header view. (Optional.)
  8. Set detail key into the detail view.
  9. Read or Browse/Fetch the detail view to get to the detail. The key for the batch and the header in the detail view are assumed to be set already.
  10. Set the fields to be updated in the detail view. (Optional.)
  11. Update or Delete the detail.
  12. Go to step 8 to update another detail.
  13. Update the header. (This will do a Post of the details.)
  14. Go to step 5 to update another header.
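In terms of the sample program’s view objects, updating a single detail line might look like the sketch below. This is only an illustration of the protocol, not code from the sample: the variable names (batchNumber, invoiceNumber, lineNumber) are placeholders, and the boolean lock arguments to Read follow the conventions used in earlier samples in this series.

```csharp
// Sketch: update one detail line in an existing A/R invoice batch.
// Step numbers refer to the update protocol above.

// 1-2. Set the batch key and read the batch record.
arInvoiceBatch.Fields.FieldByName("CNTBTCH").SetValue(batchNumber, false);
arInvoiceBatch.Read(false);

// 5-6. Set the header key and read the header; the batch key
//      flows down to the header view automatically.
arInvoiceHeader.Fields.FieldByName("CNTITEM").SetValue(invoiceNumber, false);
arInvoiceHeader.Read(false);

// 8-9. Set the detail key and read the detail; the batch and
//      header keys are assumed to be set already.
arInvoiceDetail.Fields.FieldByName("CNTLINE").SetValue(lineNumber, false);
arInvoiceDetail.Read(false);

// 10-11. Change a field, then Update (or call Delete() to remove the line).
arInvoiceDetail.Fields.FieldByName("IDITEM").SetValue("CA-78", false);
arInvoiceDetail.Update();

// 13. Update the header, which does a Post of the detail changes.
arInvoiceHeader.Update();
```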

Protocol for deleting batch

  1. Set the batch key in the batch view.
  2. Read or Browse/Fetch the batch. This causes the audit stamp to be read.
  3. Delete the batch. (This causes all headers and details in that batch to be deleted.)

Protocol for deleting header

  1. Set the batch key in the batch view.
  2. Read or Browse/Fetch the batch.
  3. Set the header key in the header view.
  4. Read or Browse/Fetch the header. This causes the audit stamp to be read.
  5. Delete the header. (The details will be deleted.)
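Both delete protocols can be sketched with the same view objects as the sample program; again the variable names are placeholders and the Read lock arguments are assumptions based on earlier samples in this series. Note the Read before each Delete, which loads the audit stamp:

```csharp
// Sketch: delete an entire batch.
arInvoiceBatch.Fields.FieldByName("CNTBTCH").SetValue(batchNumber, false);
arInvoiceBatch.Read(false);        // reads the audit stamp
arInvoiceBatch.Delete();           // cascades to all headers and details

// Sketch: delete a single header (and its details) instead.
arInvoiceBatch.Fields.FieldByName("CNTBTCH").SetValue(batchNumber, false);
arInvoiceBatch.Read(false);
arInvoiceHeader.Fields.FieldByName("CNTITEM").SetValue(invoiceNumber, false);
arInvoiceHeader.Read(false);       // reads the audit stamp
arInvoiceHeader.Delete();          // the details are deleted with the header
```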

Protocol for browsing

  1. Set the batch key in the batch view.
  2. Read or Browse/Fetch the batch.
  3. Get fields from the batch view.
  4. Browse/Fetch the header view. (This will not go beyond the batch key in the batch view.)
  5. Get fields from the header view.
  6. Browse/Fetch the details. (This will not go beyond the batch and the header keys.)
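The browse protocol naturally becomes a pair of nested loops. The sketch below assumes the Browse(filter, ascending) / Fetch(lock) calling conventions and the Value property used in earlier samples in this series; the empty filter string means “all records”, and the higher level keys automatically restrict what each inner Browse returns:

```csharp
// Sketch: browse all headers and details in one batch.
arInvoiceBatch.Fields.FieldByName("CNTBTCH").SetValue(batchNumber, false);
arInvoiceBatch.Read(false);

arInvoiceHeader.Browse("", true);          // restricted to the current batch
while (arInvoiceHeader.Fetch(false))
{
    string customer = (string)arInvoiceHeader.Fields.FieldByName("IDCUST").Value;

    arInvoiceDetail.Browse("", true);      // restricted to the current header
    while (arInvoiceDetail.Fetch(false))
    {
        string item = (string)arInvoiceDetail.Fields.FieldByName("IDITEM").Value;
        // ... do something with each detail line ...
    }
}
```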

Process Views (SuperViews)

This class of views is mainly used to implement procedures that do not involve editing data. Examples are consolidation, year-end procedures, and posting a batch.

Typically you set some fields in the View and then call Process to do the operation. These tend to be the simplest Views to use and yet often do the most processing.
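As a sketch, a Process view call usually amounts to three lines. The view ID “XX9999” and the field name “BATCHID” below are purely illustrative placeholders, not an actual Sage 300 view or field:

```csharp
// Sketch: a typical Process view call (illustrative IDs only).
ACCPAC.Advantage.View processView = mDBLinkCmpRW.OpenView("XX9999");
processView.Fields.FieldByName("BATCHID").SetValue(batchNumber, false);
processView.Process();   // performs the long-running operation
```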

What Class is my View?

OK, but now I want to program some set of Views, so what class are they? One way to find out is to look in the application’s xx.ini file. For instance, for A/R 6.1A, look in “Sage 300 Program Files”\ar61a\ar.ini; at the top is an [Objects] section that lists all the View classes for A/R. This is there so standard system components know which protocol to use on a given set of Views. You can also check how to make the calls by doing a macro recording of the UI performing the operations you are interested in.

Sample Program

The sample program ARInvEntryWinForms (located here) has been updated to follow exactly the insert protocol for a batch/header/detail View, with all the steps from above added as comments.

// 1. RecordCreate batch to get the next available key for the batch.

// 2. Set the fields in the batch.
arInvoiceBatch.Fields.FieldByName("BTCHDESC").SetValue("My new batch", false);

// 3. Update batch. (The batch record is already inserted by the RecordCreate call, in step 1.)

// 4. RecordCreate header to get the next available key for the header.
//    The key for the batch in the header view is assumed to be set already.

// 5. Set the fields in the header.
arInvoiceHeader.Fields.FieldByName("IDCUST").SetValue(custNumber.Text, false);

// 6. RecordCreate detail to initialize the fields.

// 7. Set zero (or last detail number) into the detail key field to insert from the start.
arInvoiceDetail.Fields.FieldByName("CNTLINE").SetValue(0, false);

// 8. Set values in the other detail fields.
arInvoiceDetail.Fields.FieldByName("IDITEM").SetValue("CA-78", false);

// 9. Insert detail.
arInvoiceDetail.Insert();   // Insert the detail line (only in memory at this point).

// 11. Insert header. (This will do a Post of the details.)


This was a quick introduction to the View protocols for performing basic CRUD operations on the various classes of Views. These are all one-level descriptions, but they generalize to more complicated cases, such as header-detail-detail Views where the middle detail acts as a header for the second detail. The ideas behind these protocols will be used extensively in all future articles.

Written by smist08

November 2, 2013 at 6:22 pm