Increase Big Data’s Value with Monitoring Systems
By Clinton Jones on June 10, 2014
For a portion of my career I worked in the transportation and logistics industry. I was responsible for working with a couple of technology vendors to track vehicles and articles of inventory or equipment.
This concept of the real-time location system, or RTLS, became popular as an industry label in the late 1990s and was actively promoted at events like ID EXPO. The technology effectively required a matrix of RFID tags and transceivers and, in particular, the instrumentation of heavy equipment in facilities with transponders and other sensors that communicated back to a central hub and data store. A great deal of inference was made about the state of things based on sensor responses.
These days we use similar technology to track children (using smart wristbands), chip our pets, livestock, wildlife, the weather patterns, seismic events, and even the temperature in our houses. We track the speed and location of vehicles, and we even track ourselves with the assistance of mobile phone apps.
Of course, the idea of locator systems isn't particularly new. In fact, the aviation industry has been using this kind of technology since before the First World War.
Initially, route location was a matter of self-identification: pilots worked out where they were on a route using a variety of marker beacons. Radio was quickly introduced, and a variety of beacons and transmitting and receiving systems have since helped pilots navigate.
In the 1960s, ILS, or Instrument Landing Systems, began to be introduced to assist with landing approaches. As a consequence, a great deal of flying these days is done using cockpit instrumentation in conjunction with the flight management system and sectional charts.
GPS receivers and other electronic wizardry all play their parts in route management, and the role of air traffic control is also significant, both in keeping the pilot informed and in keeping track of who is in which areas of the airspace. This has fed the sentiment that the romance of the commercial airline pilot has been lost, and that pilots' handling skills have degraded since the introduction of fly-by-wire, glass-cockpit, fully automated aircraft.
This is a particularly interesting view given the 2009 aviation event in which Captain Chesley B. (Sully) Sullenberger, III ditched his plane in the Hudson River, saving his crew and passengers through extraordinary actions that had little to do with all these newfangled flying tools.
One could further add that the regimented nature of many flight operations, the growth in airspace control and the general availability of ILS make flying a more standardized procedure compared with the art it was often considered in the past.
Some of the gloss of aspiring to be a commercial pilot may therefore have worn off as pilots come to be thought of as specialist computer operators. There is nonetheless a discipline here that is akin to how we should be managing our interaction with systems of record and data stewardship in general.
Data black holes often arise due to data feed failures
RTLS, like the technology used in aviation tracking, is all about the positioning and placement of objects. We used it in shipping container yards the same way you would use a warehouse management system to identify where you put a box or package of goods. Doing this properly required all the sensors to be working and storing large quantities of data in a database, where an analytical engine could calculate where things were. Every sensor ping and response was logged and formed part of a calculation.
In our RTLS ecosystem, just as in the aviation ecosystem, every component has an explicit identity and role. If one or more components fail, there is a black hole in the information, and no conclusive statement can be made about the disposition of the object or equipment until normal transmissions resume.
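As a rough sketch of that idea (not the actual system we used; the sensor IDs, asset IDs and the ten-minute silence threshold are all hypothetical), a locator engine can report the last known sensor fix for an asset, but must refuse to draw a conclusion once the asset has gone silent too long:

```python
from datetime import datetime, timedelta

# Hypothetical ping log: (sensor_id, asset_id, timestamp)
pings = [
    ("gate-1", "container-42", datetime(2014, 6, 10, 9, 0)),
    ("yard-3", "container-42", datetime(2014, 6, 10, 9, 5)),
    ("gate-1", "container-77", datetime(2014, 6, 10, 8, 30)),
]

def last_known_position(asset_id, pings, now, max_silence=timedelta(minutes=10)):
    """Return the most recent sensor fix for an asset, or None when the
    asset has been silent longer than max_silence -- a data 'black hole'."""
    fixes = [(ts, sensor) for sensor, asset, ts in pings if asset == asset_id]
    if not fixes:
        return None
    ts, sensor = max(fixes)  # latest ping wins
    if now - ts > max_silence:
        return None  # no conclusive statement until transmissions resume
    return sensor

now = datetime(2014, 6, 10, 9, 12)
print(last_known_position("container-42", pings, now))  # yard-3
print(last_known_position("container-77", pings, now))  # None: gone dark
```

The key design point is the explicit None: a stale fix is treated as no fix at all, rather than being silently reported as the asset's current location.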
The world of big data known unknowns
Big data analytics is much the same, except that the ecosystem is not necessarily as cut and dried: the sensors may take many forms and may evolve independently of those who monitor, track and analyze the data.
Companies usually have a great deal of data about their operations, but they can lack knowledge of the location of their finished goods, mobile equipment, reusable containers, vehicles and other assets at any given moment. Sales data, for example, may be multichannel, and things like after-market sales of objects may not be easy to track, yet they form an important part of understanding the market and its opportunities.
My little big data world
I found this particularly true this week when I decided to extract some data from our SAP IDES system.
At one stage several years ago, it was very important for us to put some control around how our demonstration data looked in the IDES system. There’s nothing more embarrassing than extracting data as part of a demo and having the system return complete garbage because someone has been playing with the data as part of training or experimentation.
Because I hadn't tracked this data in any way for some time, I was horrified to find how much rubbish data we had accumulated. My lack of monitoring, and of knowledge about who had been playing in our systems, left me in the dark about just how bad our demo data quality had become.
I am studying just a tiny sliver of the data in our system. When you consider the thousands of tables and processes housed in an ERP system, it is conceivable that I have only scratched the surface in identifying issues.
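To make the monitoring idea concrete, here is a minimal sketch of the kind of rule-based check that catches rubbish demo data. It is not how our IDES clean-up actually works; the record layout, field names and rules are all illustrative:

```python
# Hypothetical extract of master records, e.g. material data pulled from
# an ERP table into a list of dicts; field names are illustrative only.
records = [
    {"material": "MAT-001", "description": "Steel bolt M8", "price": 0.12},
    {"material": "MAT-002", "description": "test test test", "price": 0.0},
    {"material": "MAT-003", "description": "", "price": -5.00},
]

def quality_issues(record):
    """Return a list of rule violations for one record."""
    issues = []
    desc = record.get("description", "").strip()
    if not desc:
        issues.append("missing description")
    elif "test" in desc.lower():
        issues.append("placeholder/test description")
    if record.get("price", 0) <= 0:
        issues.append("non-positive price")
    return issues

# Report only the records that break at least one rule.
report = {r["material"]: quality_issues(r) for r in records if quality_issues(r)}
print(report)
```

Run regularly against fresh extracts, even a handful of rules like these turns an unmonitored data black hole into a trend you can watch and act on.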
Fortunately, many of these issues can be addressed using a combination of Winshuttle Studio Query and Transaction scripts, but it does make one wonder how effective big data analytics can be when there are so many sources of data, so many incidents of black-hole data, and no flight paths for tracking data in ERP systems.
Winshuttle helps the audit process
I am currently having several parallel conversations with customers to understand how they are using Winshuttle to help audit the data in their systems. If you are doing work in this space, I would love to hear from you, and in particular to hear what kind of data monitors you have in place.
All indications are that, in the finance space at least, the ability to demonstrate sustained data management and monitoring around key transactional and master record events can save your business money, not only operationally but also at audit time.
About the author
Clinton Jones is a Director for Finance Solutions Management at Winshuttle where he has worked since 2009. He is internationally experienced having worked on finance technologies and business process with a particular focus on integrated business solutions in Europe, the Middle East, Africa and North America. Clinton serves as a technical consultant on technology and quality management as it relates to data and process management and governance for finance organizations globally. Prior to Winshuttle he served as a Technical Quality Manager at SAP and with Microsoft in their Global Foundation Services group.
Questions or comments about this article?
Tweet @uploadsap to continue the conversation!