
Tuesday, June 21, 2016

What are your Developers doing?



I recently wrote this article and published it on LinkedIn.  I've reproduced the article here after obtaining permission from the author (that's me)...

There's no argument.

If you are building a business application today, then it must be a web application.  Not a web site, but a web application.

I found this tongue-in-cheek article on the web (tilomatra.com)...

I agree, I can’t keep up. I just finished learning backbone.js and now I’ve found out that it’s old news, and I should use ember.js, cross that, it has opinions, I should use Meteor, no, AngularJS, no, Tower.js (on node.js), and for html templates I need handlebars, no mustache, wait, DoT.js is better, hang on, why do I need an HTML parser inside the browser? isn’t that what the browser for? so no HTML templates? ok, DOM snippets, fine, Web Components you say? W3C are in the game too? you mean write REGULAR JavaScript like the Google guys? yuck, oh, I just should write it with CoffeeScript and it will look ok, not Coffee? Coco? LiveScript?  DART? GWT? ok, let me just go back to Ruby on Rails, oh it doesn’t scale? Grails? Groovy? Roo? too “Springy?” ok, what about node.js? doesn’t scale either?? but I can write client side, server side and mongodb side code in the same language? (but does it have to be JavaScript?) ok, what about PHP, you say it’s not really thread safe? they lie?? ok, let me go back to server coding, it’s still Java right? no? Lisp? oh it’s called Clojure? well, it has a Bridge / protocol buffers / thrift implementation so we can be language agnostic, so we can support our Haskell developers. or just go with Scala/Lift/Play it’s the BEST framework (Foresquare use it, so it has to be good). of course we won’t do SOAP and will use only JSON RESTful services cause it’s only for banks and Walmart, and god forbid to use a SQL database it will never scale...
Whilst it is obviously a humorous article, the underpinning facts are true - there is a myriad of web technologies available and the rate of change is not slowing.  So building a business-grade web application is very challenging because you require developers with many skills (see the infographic below).  And once the application is in production, it will need to be maintained - and that's where I see massive cost.  Why?  Because developers like to have new stuff to learn and use.  Once an application is in Production, its technology profile is set and, in a year's time, the developers will be using the latest shiny hubcaps - so the business is faced with "instant legacy systems" and perhaps a lack of developers with the requisite skills.


The above infographic is from http://www.skilledup.com/articles/web-developer-job-descriptions-skills-they-require based on a study of over 28 million online job postings from the period of May 2013 to September 2014.

So it's no joke, web developers need up to a dozen languages/technologies to do their job.  Or do they?

But maybe there is a different approach...

I work for LANSA, so I'm putting that out there.  However, LANSA has always produced development tools that provide "technology insurance" for the business.  This means that with one technology - Visual LANSA - you can build your modern, business-grade application for the web.  This will not be popular with web developers, who will argue until the cows come home that they absolutely must have this widget or that framework - it simply is not true.

Developers who want to deliver the best outcomes for their business - fast, robust, modern and maintainable - will see the value in this approach.

On the business side, the answer has been to outsource the problem - sometimes overseas.  This may address cost, but not time-to-market, high quality, maintainability, etc.

Take a moment to think outside the box about what your business needs when developing and maintaining its applications - whether they are for use within the organisation or by its customers.

nuff sed...


Monday, October 26, 2015

Configuring a network printer on Linux

I have a printer (in my case a Brother MFC-990CW) - a combined printer & scanner.

I did the usual things - went to the Brother Support site, followed the pre-installation steps, downloaded the shell script and ran it.

Opened up System Tools>Printers (I'm using Lubuntu on the netbook involved).

Added a network printer, and it found the printer on the network.



Then I went on to select the driver and it all looked hunky dory - except it didn't print.  Jobs sat in the print queue saying "cannot connect to printer".

So what gives?

In my case, after trying quite a few things, it turned out to be very simple.

It wasn't resolving the network name - BRN001BA96E9A3A

I have the printer on a fixed IP address, so all I had to do was replace the printer network name with the IP address - then it all worked.
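If you want to confirm that name resolution really is the culprit before changing anything, a quick check from any machine on the network will tell you.  Here is a minimal Python sketch - the IP address shown is a placeholder for whatever fixed address your printer has:

    import socket

    PRINTER_NAME = "BRN001BA96E9A3A"   # the printer's network name
    PRINTER_IP = "192.168.1.50"        # placeholder - substitute your printer's fixed IP

    try:
        print(PRINTER_NAME, "resolves to", socket.gethostbyname(PRINTER_NAME))
    except socket.gaierror:
        # Name resolution failed - the same reason the print queue couldn't connect.
        # Put the fixed IP in the printer's device URI instead of the network name.
        print("Cannot resolve", PRINTER_NAME, "- use", PRINTER_IP, "in the printer URI instead")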

I hope I save some people some time.

Friday, October 23, 2015

Integration Discombobulation

“The state of confusion brought on by the overwhelming challenge of the flow of data between disparate applications both within and without your organization.”

There is a growing concern among organizations of all sizes that things may be “falling through the cracks” – why is this so?  From my experience, there is pressure from the business to “just get it working” – which we do.  Sometimes we have the luxury of a reasonable budget, ample time to elicit full requirements, design the outcome and create the solution… Sometimes…

But that is not always so, particularly when things change AFTER the solution is in place, as the business tries to slim down on operational costs to maintain profits.

This leads to a variety of bespoke solutions.  Have you ever seen any of the following?

- People create “macros” in tools such as Excel to “manipulate” data into a desired format.
- Data is sent/received as attachments in emails.
- Data from files/reports is entered “by hand” into your systems of record.
- Information is sent to the wrong party – which can be embarrassing or, worse, a breach of commercial confidence.

And when things go wrong, as they invariably do, what happens?  Sometimes the consequences have minimal business impact, a simple re-sending of an email may suffice.  But what happens when the excrement hits the air movement device?  There can be business costs, penalties, even loss of clients.  Lawsuits, loss of confidence in your organization’s products or services – the list goes on.

It is high time that organizations realize that the flow of data – both within (between an organization’s applications) and without (with your trading partners) – needs to be managed and controlled in a standard, reliable and robust manner.  There have been many tags given to this, one of which is the Enterprise Service Bus or ESB.  However, many of these types of solutions do not capture the “lower level” manual processes and automate them.

Several popular tools purport to address this problem space.  Unfortunately, these solutions become more complex than the problems they are attempting to resolve.  The solution you require must allow Business Analysts to create process flows without the need for programming.

Let’s take a look at the 4 key areas that such a solution must provide…

1. Transportation – by what means will the data move from application to application; organization to organization?
2. Transformation – what data format is required so the recipient can process it without error?
3. Orchestration – how will process flows with multiple steps/stages be managed, and what happens if there is a failure?
4. Administration – how will process flows be designed, created and deployed? Who can update process flows? Who manages the operational aspects?

Transportation must be able to handle modern methods such as RESTful web services as well as older methods such as File Transfer (FTP), email (POP3, SMTP), etc.
Ideally, we should be able to create a profile for each trading partner, or internal application, that defines the method of transport they use.  In this way we can have multiple trading partners sending data using different methods of transportation.
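To make the profile idea concrete, here is a minimal Python sketch - the names (TransportProfile, send_via_rest and so on) are invented for illustration, and a real integration product would hold profiles as configuration rather than code:

    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class TransportProfile:
        partner: str
        method: str                 # e.g. "rest", "ftp", "email"
        settings: Dict[str, str]

    def send_via_rest(payload: bytes, settings: Dict[str, str]) -> None:
        print(f"POSTing {len(payload)} bytes to {settings['url']}")

    def send_via_ftp(payload: bytes, settings: Dict[str, str]) -> None:
        print(f"Uploading {len(payload)} bytes to {settings['host']}{settings['path']}")

    SENDERS: Dict[str, Callable[[bytes, Dict[str, str]], None]] = {
        "rest": send_via_rest,
        "ftp": send_via_ftp,
    }

    profiles = [
        TransportProfile("Acme Pty Ltd", "rest", {"url": "https://example.com/orders"}),
        TransportProfile("Widget Co", "ftp", {"host": "ftp.example.com", "path": "/inbound"}),
    ]

    payload = b"<orders/>"
    for profile in profiles:
        # Each partner uses its own transport; the flow itself doesn't change.
        SENDERS[profile.method](payload, profile.settings)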

Transformation must allow the business analyst to map the data format of the trading partner to that of the receiving application.  And there are 2 aspects of this:


1. The format of the data may be different – for example, the incoming data may be in an XML structure, but the receiving application requires it in a table in an RDBMS (relational database).
2. The actual incoming data elements need to be mapped to their receiving counterparts.  There may be other functions required to manipulate the data element.  There may also be the need for data augmentation or enrichment.
All this needs to be done by Business Analysts, without programming.
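Purely to make the mapping idea concrete - in practice a Business Analyst would do this in a mapping tool, not in code - here is a minimal Python sketch of the XML-to-table case.  The element names and table layout are invented:

    import sqlite3
    import xml.etree.ElementTree as ET

    # Invented incoming document - the structure is whatever the trading partner sends.
    incoming = """
    <orders>
      <order number="1001" customer="ACME">
        <line sku="ABC-1" qty="3"/>
        <line sku="XYZ-9" qty="1"/>
      </order>
    </orders>
    """

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE order_lines (order_no TEXT, customer TEXT, sku TEXT, qty INTEGER)")

    root = ET.fromstring(incoming)
    for order in root.findall("order"):
        for line in order.findall("line"):
            # Map each incoming element to its receiving counterpart (with a simple
            # conversion of qty to an integer as an example of data manipulation).
            conn.execute(
                "INSERT INTO order_lines VALUES (?, ?, ?, ?)",
                (order.get("number"), order.get("customer"), line.get("sku"), int(line.get("qty"))),
            )

    print(conn.execute("SELECT * FROM order_lines").fetchall())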

Orchestration must control the steps in a process flow.  This may mean that the flow needs to fork and perform multiple parallel flows, or it may be a simple, sequential series of steps.  It also needs to handle exceptions or error conditions, when (not if) something goes awry.  Although not strictly Orchestration, the solution should – where possible – allow a process flow to be resumed at the point it failed, once the issue has been resolved.
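As a rough sketch of the resume-at-the-point-of-failure idea (the step names are invented, and a real product would keep its checkpoints somewhere durable and auditable rather than a local file):

    import json
    from pathlib import Path

    CHECKPOINT = Path("flow_checkpoint.json")

    def receive():   print("receive: file picked up")
    def transform(): print("transform: mapped to target format")
    def deliver():   print("deliver: pushed to receiving application")

    STEPS = [("receive", receive), ("transform", transform), ("deliver", deliver)]

    def run_flow():
        done = json.loads(CHECKPOINT.read_text()) if CHECKPOINT.exists() else []
        for name, step in STEPS:
            if name in done:
                print(f"skipping {name} (already completed)")
                continue
            try:
                step()
            except Exception as exc:
                # Leave the checkpoint in place so the flow resumes here on the next run.
                print(f"step {name} failed: {exc} - fix the issue and re-run")
                return
            done.append(name)
            CHECKPOINT.write_text(json.dumps(done))
        CHECKPOINT.unlink(missing_ok=True)   # flow completed; clear the checkpoint
        print("flow completed")

    run_flow()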

Administration must provide a management framework to secure the creation and amendment of process flows.  This includes the deployment into Testing, Quality Assurance, User Acceptance Testing and Production environments.

Summing it all up

Integration means that data flows from the provider and is delivered to the recipient in a format, and by a method, that the recipient can process.  And human intervention must be removed from this process flow – it must be automated and managed.

If you like, it is Business Process Integration and Business Process Automation rolled up in one.


Wednesday, June 24, 2015

Business Process Automation & Integration

There is a heap of information on Business Process Automation - so I'm not going to go to great lengths to talk about the different nuances.

Instead I'm going to talk about a few hypothetical cases...


Case 1 - incoming orders


Let's suppose you work in a business that provides some form of product.  Some of your buyers send through orders via email.  Usually the email has an attachment.  It could be an Excel spreadsheet, a CSV file or some other format.  You need to get this information into your company's ERP or in-house application - henceforth let's just call it ERP for simplicity's sake.  Perhaps you have set up a special email address to which your clients can email their orders - let's call it orders@mycompany.com

So it's somebody's job to periodically check this Inbox to see if there are orders to be processed.  If there is an email, then the attachment is saved.  Then you edit the attachment into a specific form that will allow it to be loaded into your ERP for processing.  You need to check that the load process worked OK and maybe you have to send an email back to your client to advise them that their orders were accepted and perhaps even give them tracking information.

There's a lot that could go wrong.  But the main problem is that we are relying on a person to manually tend to the whole process of getting the data into shape, loading it, checking the result and communicating back to the client.

There are 2 major processes involved here - the automation of multiple manual steps and the integration into your company's ERP.

Let's deal with the automation side of things first.

In Case 1 we need to first handle how the data gets to our company.  At a high level, we call this Transportation - how the data is moved.  We see that email is the method of transportation, or more technically it is SMTP & POP.

Next we have to consider the data "payload".  And there are 2 aspects to this:
  1. What file type is being used, and
  2. What data format is being used
In Case 1, let's assume it is a simple text file using CSV data formatting.

So we need to create a process that reads emails coming into a specific Inbox, then detaches the file for further processing.
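A minimal sketch of that first step in Python, assuming the orders mailbox is reachable over IMAP (the host and credentials are placeholders; a POP3 version would follow the same shape):

    import email
    import imaplib
    from pathlib import Path

    IMAP_HOST = "mail.example.com"                      # placeholder mail server
    USER, PASSWORD = "orders@mycompany.com", "secret"   # placeholder credentials
    SAVE_DIR = Path("incoming_orders")
    SAVE_DIR.mkdir(exist_ok=True)

    with imaplib.IMAP4_SSL(IMAP_HOST) as imap:
        imap.login(USER, PASSWORD)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")           # only messages not yet processed
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            for part in msg.walk():
                filename = part.get_filename()
                if filename:                            # this part is an attachment
                    (SAVE_DIR / filename).write_bytes(part.get_payload(decode=True))
                    print("saved attachment", filename)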

However, it is unlikely that the incoming format is exactly what we need in order to load into our ERP.  So we need to munge it into shape.  Quite often this is a very manual process and prone to errors.  We need to map the incoming data elements to the format required by our ERP.  This involves the Transformation of the data.  It is also where the Integration part of this blog's heading comes into play.  In a Transformation process we are concerned about getting the data into the right shape and moving it to the right data element on the receiving side.  Now this too may vary just as the incoming data varied.  We could be loading it into a flat file that is read by a loader program.  Or we could be directly updating an Incoming Orders table in our ERP, where it will be processed by an existing mechanism.

In some cases we may need to create the flat file, and then invoke the loader program as a subsequent step.
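Purely as an illustration of the mapping-then-load idea - the column names, target layout and loader command are all invented for the example:

    import csv
    import subprocess
    from pathlib import Path

    # Invented mapping: the buyer's CSV columns -> the ERP load-file columns.
    COLUMN_MAP = {"PO Number": "ORDNUM", "Item Code": "ITEM", "Quantity": "QTY"}

    def transform(csv_path: Path, out_path: Path) -> None:
        with csv_path.open(newline="") as src, out_path.open("w", newline="") as dst:
            reader = csv.DictReader(src)
            writer = csv.DictWriter(dst, fieldnames=list(COLUMN_MAP.values()))
            writer.writeheader()
            for row in reader:
                # Map each incoming element to its ERP counterpart.
                writer.writerow({erp: row[partner] for partner, erp in COLUMN_MAP.items()})

    if __name__ == "__main__":
        transform(Path("incoming_orders/orders.csv"), Path("erp_load.csv"))
        # Hypothetical loader program supplied by the ERP - replace with the real command.
        subprocess.run(["erp_order_loader", "erp_load.csv"], check=True)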

As you can see from Case 1, we have a few steps that need to work together, much like the separate musicians in a band - so we call this Orchestration - ensuring that the process steps work together in a flow that produces the desired outcome.

And to control the creation of these Processing Sequences and who can access them, we need the management layer - or Administration.

Case 2 - Outgoing Notifications

In much the same way as we handled Incoming Orders, we will handle Outgoing Notifications.  We need to create a Processing Sequence that handles transportation, transformation and orchestration.  Our ERP may have finished processing some orders and we want to notify the client of the outcome.  The client expects us to FTP an Excel file to a specific folder, but our ERP creates XML files in specific client folders - each client has its own folder.  The process will watch for files being created in this folder.  When a file is detected, we convert the XML into an Excel spreadsheet (as agreed with this specific client) and transfer it to the client's remote folder via an FTP connection.  After the transfer succeeds, we move the file to an Archive folder for this client and send the client an email advising them that a file has been FTP'd to them, along with any other pertinent data.
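A minimal Python sketch of that flow, leaving out the XML-to-Excel conversion (which would need a spreadsheet library) and using invented host names, folders and addresses:

    import ftplib
    import shutil
    import smtplib
    import time
    from email.message import EmailMessage
    from pathlib import Path

    CLIENT_DIR = Path("outgoing/acme")        # where the ERP drops this client's files
    ARCHIVE_DIR = CLIENT_DIR / "archive"
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)

    def deliver(path: Path) -> None:
        # 1. Transfer the file to the client's remote folder (placeholder credentials).
        with ftplib.FTP("ftp.acme.example.com", "user", "password") as ftp:
            ftp.cwd("/inbound")
            with path.open("rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
        # 2. Move the delivered file to this client's Archive folder.
        shutil.move(str(path), str(ARCHIVE_DIR / path.name))
        # 3. Email the client to advise that the file has been sent.
        msg = EmailMessage()
        msg["Subject"] = f"File {path.name} delivered"
        msg["From"], msg["To"] = "noreply@mycompany.com", "orders@acme.example.com"
        msg.set_content(f"{path.name} has been placed in your inbound FTP folder.")
        with smtplib.SMTP("mail.mycompany.com") as smtp:
            smtp.send_message(msg)

    while True:                               # a simple polling "watch" on the folder
        for xml_file in CLIENT_DIR.glob("*.xml"):
            deliver(xml_file)
        time.sleep(60)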

Case 3 - Application to Application Integration

In our last case, let's look at intra-company process flows.  Quite often a company will end up with multiple applications running on different platforms.  Some will be older, perhaps authored in-house, but comprising the very core of the company's systems.  New applications may be brought in - sometimes through acquisition.  Rarely do these applications work together without some form of integration.  What we need is something that can handle both older methods of exchanging data (like file transfer) and newer methods such as web services - both SOAP and RESTful.

However, your business processing sequence will still need to have the same building blocks - handle the transportation method from one application, transform the data (for the receiving application), transport it into the receiving application and then kick off the processing.
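As a very small example of the newer end of that spectrum, pushing transformed data into the receiving application over a RESTful interface could look like this - the URL and payload are invented, and a real flow would add authentication and error handling:

    import json
    import urllib.request

    # Invented endpoint on the receiving application.
    URL = "https://erp.example.com/api/incoming-orders"

    payload = json.dumps(
        {"order_no": "1001", "customer": "ACME", "lines": [{"sku": "ABC-1", "qty": 3}]}
    ).encode()

    request = urllib.request.Request(
        URL, data=payload, headers={"Content-Type": "application/json"}, method="POST"
    )
    with urllib.request.urlopen(request) as response:
        # A 2xx response tells the flow it can kick off the next processing step.
        print(response.status, response.read().decode())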

Summary

To wrap it up, you want a solution that a business analyst can use to define your business process flows - think of it as workflow for applications.  You don't want to be writing bespoke programs that provide point solutions, as these will just add to your maintenance overhead.  You want a centralised, code-free solution that handles multiple platforms, data formats and transportation methods in a secure, repeatable fashion.

If this sounds like something your company needs, then I suggest you have a look at LANSA Composer and automate your manual processes.

Friday, January 10, 2014

SmartWatches - Geek to Mainstream in 2014?

They won't look like the watch from the old cartoon series Dick Tracy - they'll be better!


There is a growing wave of SmartWatches out there.  Kickstarter has the Pebble watch and the Agent watch, and there's also the Geak.



And every man and his dog are releasing them - like Sony, Samsung, etc.

But nobody seems to have cracked it yet.  They all want you to touch the itty bitty screen with your blobby finger - hmmm.  I think it needs to mature and perhaps be gesture based or voice driven - but it will need to recognise your voice AND understand what you are saying.  Think about what happens if a stranger says "delete all emails" and your watch says - "yes sir, right away!"

So I think there is some maturing yet, but with today's crazy-fast innovation cycles, it shouldn't take too long - over and out!

Monday, May 6, 2013

Application Mining

There's much talk today about Data Mining and Big Data.  What I don't see discussed much is Application Mining.

OK, so what is it?  Well in my thinking, it's where you have applications that are effectively performing 1 or more business functions for your organisation.  They aren't the "rising star" applications - more the "cash cow" applications.  You have already invested in them, you maintain them - they just do the job. Some people call them Legacy Applications.

Quite often they are the "green screen" applications - great for data entry, built in a time when GUI sounded like something sticky that you didn't want to accidentally sit on!

The value here is the data AND function.

So how do you leverage this corporate asset?

Intelligent Screen Scraping

Sounds awful - scraping the screen.  Let's use another phrase to cover this - "screen interception".  There are a few products that can intercept the information from the server that would normally be displayed on a dumb terminal or in some form of terminal emulator running on the user's PC.

After intercepting the screen instructions, the software can display this to the user in a browser.

This "out of the box" transformation already delivers a benefit of not having to install any special software on the user's machine.

The next level of sophistication is to re-model the browser pages to leverage the power of using the browser and form objects such as date pickers, drop-down lists, etc.  At this point we should also be contemplating how we could revamp the process so that the user interaction is streamlined and designed for ease of use.

We can do this by marshaling multiple "green screens" behind the scenes and presenting a single web page to the user.  This more than compensates for the "data entry speed" that some consultants claim is lost when moving from a character-based screen to a modern web page.

It is also possible to set up a framework that allows you to move some functions out into a web template and better utilise the server functionality.

Delving Deeper Into the Application

Once we have mapped our screens in this way, we can extend their use past the browser.

This can be done by combining screen interactions under a function and making it available in a SOA environment.  There are various approaches to making the function available - it could be by exposing it as a Web Service with a SOAP interface or, more likely, a JSON interface.
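Purpose-built products do this for you, but purely to make the idea concrete, here is a tiny hand-rolled Python sketch (not any vendor's API) that puts a hypothetical "inquire on customer" screen interaction behind a JSON endpoint:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def inquire_customer(customer_no: str) -> dict:
        # Hypothetical stand-in for driving the green-screen "customer inquiry"
        # interaction and picking the result fields off the screen.
        return {"customer_no": customer_no, "name": "ACME Pty Ltd", "balance": 1234.56}

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # e.g. GET /customer/00042
            _, _, customer_no = self.path.rpartition("/")
            body = json.dumps(inquire_customer(customer_no)).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8080), Handler).serve_forever()

Once a screen interaction is wrapped like this, any application that can make an HTTP call can reuse the function without knowing a green screen is behind it.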

It could even be made available to less sophisticated systems running on different operating systems or even different networks.  This is where a product like LANSA's Integrator comes in handy.  LANSA have done a lot of good work in continuing to refine, and add new functionality to, LANSA Integrator.

Of course there are other products out there - so it's a case of "horses for courses".  But if you have an IBM i (aka AS/400) environment, then Integrator is the go.

Saturday, March 16, 2013

Removing Thule Roof Racks from Toyota Corolla - Rapid System 317

OK, this is Not an IT post, but I hope it helps anyone that is in the same position that I was.

I bought a Toyota Corolla Ascent 2004 (through work) and optioned it up to include a Thule roof rack.  The guide for installing the roof rack can be found here.

Even this guide didn't help that much.

You get keys to unlock this roof rack.  One set has the wavy edge you expect to see on keys.  In my case I had 4 of these in the pack (don't know why as they were all the same).  But there is also a slightly larger key with straight edges - it is important.

Most of the Thule roof racks have the keyhole in the "base" - the bit that sits on the roof of your car, but not this system.  It has the keyhole in the end of the rack.

I removed the whole key assembly by:
  1. unlocking it with the wavy edge key
  2. pulling that key out, then inserting the straight edge key - all the way in
  3. pulling the black end, with the keyhole in it, outwards away from the body of the car - this allowed me to push the inside of the key mechanism and it just popped out
You probably don't have to remove the key mechanism like I did.

With the black end bit pulled out a few centimetres, but still in the rack,  start turning it counter-clockwise like you are unscrewing a top or a nut off a bolt. The whole mechanism starts to loosen.  Do this on both sides, then pick one side to loosen off enough to clear the edge of the roof and then lift to remove.

OK, so when you "get it" - the process is easy.  But I couldn't find any instructions on how to remove it on the web - and believe me I tried.

I hope this helps someone out there that searches for an answer!