Archive for February 2009
As we develop the web-based version of Accpac, we are having to develop a lot of knowledge about hacking and web-based security exploits. Learning about these things is very interesting, but also very eye opening and scary. Once you expose your application to the web, you are definitely in a different world.
We currently offer an on-line version of Accpac through http://www.accpaconline.com, but there we are using Citrix Server and relying on Citrix and Microsoft to protect us. You first have to log in to a Windows domain, and then these services provide the secure connection. Of course we still have to be vigilant: make sure all security patches are installed promptly, keep auditing our server configurations to ensure they are secure, run only the minimum set of services that could be exploited, and ensure that a pissed off user can only damage their own data, and things of that nature.
Once we leave the comfort of the Citrix/Windows domain world and expose our log-in page as the first line of defense, we are in a very different world. We now take on responsibility for a whole new level of security, where all the previous threats still exist, but there are a great many more. Now it is the hacker in their parents' basement patiently staying up all night trying to get past our login screen; patiently probing for any crack that can be exploited; piecing together many disparate bits of information to learn how to get past our security.
We now need to deal directly with denial of service (DoS), SQL injection, cross-site scripting (XSS) and many other forms of attack. We have to train our QA department to be hackers: to think like hackers and to probe our software to find these cracks before the black hat hackers do. Our programmers have to be vigilant about the security implications of everything they do.
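To show what one of these attacks actually looks like, here is a minimal sketch in Python against an in-memory SQLite table (an illustration only, nothing to do with Accpac's actual code): the unsafe login splices user input straight into the SQL text, while the safe one uses placeholders so the input can never become SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_unsafe(name, password):
    # VULNERABLE: attacker-controlled text becomes part of the SQL statement
    query = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(query).fetchone() is not None

def login_safe(name, password):
    # SAFE: ? placeholders keep attacker input as data, never as SQL
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone() is not None

# A classic injection payload bypasses the unsafe check
payload = "' OR '1'='1"
print(login_unsafe("alice", payload))  # True -- logged in without the password
print(login_safe("alice", payload))    # False -- rejected
```

The payload turns the unsafe query's WHERE clause into `... OR '1'='1'`, which matches every row; the parameterized version just compares it against the stored password and fails.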
Certainly this is going to be a very challenging journey, but a very rewarding one. The challenges are unique and interesting. It adds an extra dimension to software development: beyond just getting the functionality correct (challenging enough), the software has to be both correct and secure.
Often in large software processes, when an error is encountered the process stops dead with an error message. This can be really annoying if the program is controlling the operation of your car. With ERP packages this can be equally annoying, if not quite as deadly. Suppose you are running a long process that affects item costing and posts to A/R, G/L, etc. If that process doesn't complete, it means your A/R, G/L and costs are not up to date. Even if it stopped because of bad data, the client is still stuck. Worse, if it involves posted transactions that cannot be edited anymore, then what do you do?
It would be better if the process made note of the bad data, perhaps writing it to a log, and then proceeded past it. Then at least everything is up to date, with that small bit of data set aside. This makes dealing with the bad data a much lower priority task.
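The log-and-continue idea can be sketched in a few lines of Python (the transaction dictionaries and the `post_one` routine here are hypothetical stand-ins for whatever the real posting code looks like):

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("posting")

def post_batch(transactions, post_one):
    """Run post_one over every transaction, logging failures instead of
    stopping the whole batch dead on the first bad record."""
    errors = []
    for txn in transactions:
        try:
            post_one(txn)
        except ValueError as exc:
            # Note the bad data and move on; the rest of A/R and G/L
            # still gets brought up to date overnight.
            log.warning("Skipping transaction %s: %s", txn["id"], exc)
            errors.append((txn["id"], str(exc)))
    return errors
```

The returned error list is the "much lower priority task": someone can review it in the morning while everything else is already posted.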
The next level is that rather than just logging that something is wrong, can the program take the initiative to fix it? Is there a way to make some reasonable assumptions and proceed? Even if the assumption turns out to be wrong, correcting the data often only requires an adjusting G/L entry, versus someone editing the data in the database by hand and making worse assumptions than the program would.
Certainly stopping long running processes dead with an error is annoying, especially if the process was meant to run overnight and only a small part of the work got done. At a minimum a program should do as much work as it can without stopping, leaving only the exceptions to be dealt with separately. Being able to heal the data so the user doesn't have to would be a huge benefit, and certainly something to strive for. There are established processes for dealing with these sorts of things, like FMEA (Failure Mode and Effects Analysis) and FTA (Fault Tree Analysis). These are usually used in aircraft and automotive software engineering, but applying them to ERP and CRM systems should be a good thing as well.
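One hedged sketch of what "healing" might mean in practice (again in Python, with a made-up suspense account number and entry format, not any real accounting package's behavior): if a G/L entry's debits and credits don't net to zero, plug the difference to a suspense account and flag the entry for review, rather than halting the run.

```python
SUSPENSE_ACCOUNT = "9999"  # hypothetical suspense/clearing account

def heal_entry(entry):
    """If a G/L entry doesn't balance, post the difference to a suspense
    account and flag the entry for review, instead of stopping the run.
    Amounts are integer cents, so sums are exact."""
    diff = sum(line["amount"] for line in entry["lines"])
    if diff != 0:
        entry["lines"].append({"account": SUSPENSE_ACCOUNT, "amount": -diff})
        entry["needs_review"] = True
    return entry
```

The flagged entry can later be reversed with an ordinary adjusting entry once someone works out what the correct account should have been, which is exactly the low-cost correction described above.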
I just returned from a trip to South Africa and Zambia. It certainly makes you appreciate the technology infrastructure we have here. Hooking my laptop up to WiFi was always problematic, always for a charge and always very slow. Talking to our business partners there, they said you can't expect customers to have email addresses – most don't. Cell phones are very prevalent though, and are generally the best infrastructure (there aren't any landlines to most buildings). The Africans are very sensitive to bandwidth since it tends to be limited, and require all business applications to work efficiently in this environment.
The hotel I stayed at in Zambia was very modern and very nice to stay at; however, they were still using hand-written receipts with carbon paper at several locations to do room charges. The main restaurants had the usual bill/receipt printers, but that was it. Yet there still seemed to be good cell phone coverage everywhere. And this in a country where I only saw one paved road, all the rest being dirt and quite bumpy.
The South Africans are currently building a new fiber optic link from India to Mtunzini (just north of Durban). The plan is to greatly increase the country's overall bandwidth in time for the 2010 World Cup being held there. They want to make sure everyone attending can update their Facebook pages, send and receive email, etc.
Anyway, it's interesting to see technology being deployed to these developing countries fairly rapidly. Hopefully this will enhance their education systems and allow further self-sufficient development. South Africa already has a thriving software development industry; perhaps as infrastructure is deployed to neighboring countries this can spread too.