Some thoughts on Mass Data Creation and Change in SAP
By Clinton Jones on February 16, 2011
A number of recent conversations with customers have reemphasized the significance of, and risks associated with, mass change in SAP environments, and this prompted me to consider how we address safety measures when Winshuttle products are used in such scenarios. I thought it might be useful to revisit these as we plan for the upcoming releases of Transaction and Runner, and in particular to consider new features and assess existing ones.
Here are some items to consider on this topic. The list is not meant to be prescriptive or exhaustive, but it should give you some ideas about deploying and sustaining mass change and create scenarios in your environment.
Recordings and Prototypes
Most, if not all, customers should have at the very least a development, a quality and a productive environment. Some customers will also have what they refer to as a staging or pre-production environment, which is sized and populated with production-like data. Such environments, assuming they have the same configuration, are better choices for doing your recordings and prototypes.
You can further protect your productive data and productive environments in Transaction by setting the productive and non-productive values in the options section of Transaction. If you are using a Central-based application licensing model this is extended further: administrators can prevent the use of scripts against defined productive systems unless the scripts are approved for such use. In those circumstances the RUN button is disabled and you can only select the TEST button.
There really is no alternative to thorough testing of your data change and create scripts, and this should always be done in non-productive environments. If you are authoring scripts, also consider testing just a few rows in your productive environment rather than trying to ‘break’ the script in production. Generally a script that passes muster with just a few data lines will execute without issue. There are always exceptions, but it is better to be cautious than careless.
It is also important to emphasize that script portability, for SAP transaction recordings at least, requires common configuration or environmental conditions for users who run scripts configured or recorded on other systems. Transactions that warrant specific consideration include FV60, FB60, FB65, FMBB, FMCIA, FMSA, FMSB, GJEE, FB50, FBCJ, FB10, FB70, FB75, FV70, FV75 and MM02/03. These transactions, among the many thousands in SAP, are commonly used and have screens that may or may not pop up depending on selections you made earlier in the day or on your preferred way of using the transaction. Since transaction recordings are procedural, you may encounter errors on first runs if your environment is set up differently from that of the person who recorded the transaction. Some limited remediation is possible through the SKIP SCREEN IF NOT FOUND feature in the expert mode of the mapper.
Validate before you Update
A recently implemented feature which also assists with the quality of data create and change activities is the ability to enable pre-validation of data. This feature is enabled in the expert tab of the mapper and results in the appearance of a VALIDATE button at run time. There are, again, some limitations. For PA30 and PA40 transactions in particular, the multi-commit characteristic of these transaction screens may obviate the ability to leverage this feature. Additionally, scripts that have been recorded with multiple commit statements in the end-to-end process will not support validation properly; examples include transactions where a header must be created and saved before lines can be added. Barring these scenarios, the advantage of this feature is the ability to validate material numbers, attributes, string lengths (descriptions) and account posting codes, so experiment with it for best effect. Financial posting users may be a little frustrated that SAP doesn’t perform full debit = credit checking until an attempt is made to actually post the document, but there are a variety of ways to identify this problem using formulae in the workbook. The validate feature essentially runs the full transaction with the commit or save statement disabled. From a system administration perspective this may appear in SM20 as a canceled transaction, so awareness of this may be useful.
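As one illustration of the workbook-formula approach to the debit = credit problem, the same pre-posting balance check can be sketched in Python. The data and column layout here are hypothetical, not part of Winshuttle or SAP; the logic simply mirrors what a workbook formula such as `=IF(SUM(DebitRange)<>SUM(CreditRange),"OUT OF BALANCE","OK")` would flag before you attempt a posting:

```python
# Sketch: pre-posting balance check for financial documents, assuming a
# simple list of (document number, debit, credit) line items as you might
# hold them in a workbook. Illustrative only; not a Winshuttle API.

def out_of_balance_docs(line_items):
    """Return the document numbers whose debits and credits do not net to zero."""
    totals = {}
    for doc, debit, credit in line_items:
        d, c = totals.get(doc, (0.0, 0.0))
        totals[doc] = (d + debit, c + credit)
    # Round to 2 decimals so currency arithmetic doesn't produce false positives.
    return [doc for doc, (d, c) in totals.items() if round(d - c, 2) != 0]

lines = [
    ("1000001", 100.00, 0.00),
    ("1000001", 0.00, 100.00),   # balanced document
    ("1000002", 250.00, 0.00),
    ("1000002", 0.00, 200.00),   # out of balance by 50.00
]
print(out_of_balance_docs(lines))  # -> ['1000002']
```

Running a check like this (or its formula equivalent) over the workbook before pressing RUN catches unbalanced documents that the VALIDATE step cannot.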
Backing up your data
Current versions of Transaction with mapping to Excel as a data source support backing up the screen data before changing it. This feature may prove invaluable if you frequently change large quantities of SAP data. Be aware that there are some limitations: it doesn’t work with GUI scripting or BATCH modes, and your script may take a little longer to run because the SAP structure must be read before the fields are updated. The peace of mind that flows from this feature may, however, justify its use. The feature is enabled as an advanced run option and can be set in the Rich Client but NOT in the Excel Add-In. The author of the script can also enable this feature and save it as a characteristic of the syndicated or published script. Large data change workbooks may grow considerably with this feature enabled, so be cautious. The backup sheet is a replica of the sheet from which the data changes flow and is simply a copy of the data in the SAP screens.
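The read-before-write pattern behind the backup sheet can be pictured with a short sketch. The field names and data structures below are illustrative stand-ins for SAP screen fields, not how Transaction implements it internally:

```python
# Sketch of the "back up before change" pattern: read the current values
# first, keep a copy of anything about to change, then apply the change.
# Field names (MAKTX, MEINS) are illustrative SAP-style examples only.

def apply_changes_with_backup(current_data, changes):
    """current_data: {field: value} as read from the SAP screen before updating.
    changes: {field: new_value} to be applied.
    Returns (updated_data, backup) where backup holds the prior values."""
    backup = {f: current_data[f] for f in changes if f in current_data}
    updated = dict(current_data)
    updated.update(changes)
    return updated, backup

material = {"MAKTX": "Old description", "MEINS": "EA"}
updated, backup = apply_changes_with_backup(material, {"MAKTX": "New description"})
print(backup)   # -> {'MAKTX': 'Old description'}
```

The extra read step is also why backed-up runs take longer, as noted above: every change costs one read plus one write.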
Also consider saving each workbook that you run as a separate workbook rather than recycling the same workbook over and over. This small procedure ensures that you preserve the run log associated with the data you created or changed. The log shows the name of the script, the system it was run against, the credentials of the SAP user, and the date and time of the run. Preserving this artifact may help you with an internal or external audit.
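One lightweight way to follow this separate-workbook-per-run practice is to archive a timestamped copy of the workbook after each run. This is a sketch of that housekeeping step; the naming convention and directory layout are assumptions for illustration, not a Winshuttle convention:

```python
# Sketch: archive a timestamped copy of a run workbook so the embedded
# run log (script name, system, SAP user, date/time) is preserved for audit.
import shutil
from datetime import datetime
from pathlib import Path

def archive_run_copy(workbook_path, archive_dir):
    """Copy workbook_path into archive_dir under a timestamped name,
    e.g. MassChange_20110216_143005.xlsx, and return the new path."""
    src = Path(workbook_path)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    dest = Path(archive_dir) / f"{src.stem}_{stamp}{src.suffix}"
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, dest)  # copy2 preserves file timestamps as well
    return dest
```

A script or batch step like this after each run keeps the original template clean while the archive accumulates one auditable workbook per execution.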
SAP systems are generally tolerant of large volumes of data creation or change; they are, after all, designed to support this scenario. There are, however, a few caveats. This assumes that your SAP environment is appropriately sized in terms of hardware, appropriately maintained and patched, and that there are no abnormalities in the way the system is configured. Even if all of these potential red flags are cleared, you should consider your fellow users and all the other business processes that depend on your SAP system, and accordingly schedule particularly large jobs for off-peak times such as weekends or evenings when the system is less in use. Consider also keeping administrators and other key stakeholders apprised of your plans to execute large jobs. In environments where Central licensing is applied to Transaction users, it is possible to restrict large job execution centrally. Large parallel runs of mass creation and change not only consume SAP dialog processes; they can also result in high SAP server resource utilization, database process thread pegging and extraordinary record locking conflicts. Testing under load in your non-productive environments will give you some idea as to whether your scripts may cause any of these.
Segregation of Duties
In environments that don’t have Centrally licensed Transaction and Runner management, you have to rely on the basic options of publishing Transaction and Query scripts in workbooks and sharing those workbooks with employees and participants via email or shared drives on your network. For smaller operations this may be effective enough; however, this way of using Transaction doesn’t scale very well, and large and small customers alike are encouraged to consider the use of Central together with the Transaction and Runner clients. Central comes with a basic workflow that allows one person to create a script and another to approve that script for productive use. Additionally, scripts can have a data review process enabled, which requires one person to create the data and another to approve and run it, or simply to approve the data for the creator or someone else to run into SAP. If you have a more complex requirement, then Central with advanced workflow options may also be a possibility. The advantage of this data review process should be obvious in the context of gatekeepers: if you have less experienced employees creating data, you may want to curtail the risks to your SAP system by always having a more experienced supervisor check the data before it is posted. For large mass change and create activities in highly regulated industry sectors this may be an unavoidable step you need to put in place. The good news is that Winshuttle Central will help you do this for Transaction and Runner.
I hope these ideas and feature highlights prove helpful in understanding how you can improve your mass change and create activities and mitigate some of the risk. If you have thoughts on other ways this can be done with Excel formulae or work practices, please do share them, and if you have feature ideas that you wish the products supported, be sure to post them in the product suggestions section of the Winshuttle Community Forums.
About the author
Clinton Jones is a Director for Finance Solutions Management at Winshuttle, where he has worked since 2009. He is internationally experienced, having worked on finance technologies and business processes with a particular focus on integrated business solutions in Europe, the Middle East, Africa and North America. Clinton serves as a technical consultant on technology and quality management as it relates to data and process management and governance for finance organizations globally. Prior to Winshuttle he served as a Technical Quality Manager at SAP and with Microsoft in their Global Foundation Services group.
Questions or comments about this article?
Tweet @uploadsap to continue the conversation!