- Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
- Which department or person should play the key role in defining master data and assuring its quality?
- Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
- Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Penghui Ai says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data – data that is centrally stored (shared across application modules) and processed to eliminate redundancy. It remains relatively constant over time, but still needs to be updated on a regular basis. For example, a vendor record is master data used when creating purchase orders or contracts. Controls that help an organization assure this integration works well for all:
• A process to define the ‘true’ data, leveraging external data and business policies where possible.
• A trained ‘maker-checker’ enters and maintains the data, including independent verification of source data.
• Routine (e.g., quarterly) review of changes to critical field values (confirming changes are correct and authorized).
• Segregate ‘maker-checker’/maintainer access from those performing the process (transactions).
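The last control, segregating maintainer access from transaction access, can be sketched as a simple access-review script. Everything below is a hypothetical illustration: the user names and access assignments are made up, and a real review would read the ERP system's actual authorization data rather than a hard-coded dict.

```python
# Segregation-of-duties sketch: flag users who can both maintain vendor
# master data and post purchasing transactions. All user names and
# access assignments below are hypothetical examples.
MAINTAIN = {"XK01", "XK02"}   # vendor master create/change
TRANSACT = {"ME21N", "MIRO"}  # create purchase order, post invoice

user_access = {
    "alice": {"XK01", "XK02"},   # maintainer only: fine
    "bob":   {"ME21N", "MIRO"},  # transactor only: fine
    "carol": {"XK02", "ME21N"},  # both: SoD conflict
}

def sod_conflicts(access):
    """Return users holding both maintainer and transaction access."""
    return sorted(
        user for user, tcodes in access.items()
        if tcodes & MAINTAIN and tcodes & TRANSACT
    )

print(sod_conflicts(user_access))  # -> ['carol']
```

In practice the two sets would come from the periodic user/role export, and the conflict list would feed the quarterly review mentioned above.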
2. Which department or person should play a key role in defining master data and assuring its quality?
I think the department or person who plays the key role in defining master data and assuring its quality should depend on the data itself. For example, customer master data should be defined by the sales department, which should also assure its quality so that finance/accounting can use correct data. Likewise, the purchasing department should define and assure the quality of material master data.
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I think inaccurate data is more of a risk to a company than excessive repetitive data. If a decision maker uses inaccurate data, the resulting decision could be wrong, leading to financial loss and wasted resources. In addition, inaccurate data can produce material misstatements in the financial statements, which could result in lawsuits and a loss of stakeholder trust and investment. Finally, there are many tools that can filter out and get rid of repetitive data, whereas inaccurate data is much harder to detect and correct.
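The point about tooling for repetitive data can be illustrated with a small deduplication sketch. The records and the matching key are hypothetical; production-grade matching typically uses fuzzier comparisons (phonetic keys, edit distance) than this exact-key approach.

```python
# Deduplicate customer records on a normalized (name, postal code) key.
# The records below are hypothetical examples.
records = [
    {"name": "Acme Corp.", "zip": "19122"},
    {"name": "ACME CORP",  "zip": "19122"},  # duplicate of the first
    {"name": "Beta LLC",   "zip": "19103"},
]

def dedupe(rows):
    """Keep the first record seen for each normalized (name, zip) key."""
    seen, unique = set(), []
    for row in rows:
        key = (row["name"].lower().replace(".", "").strip(), row["zip"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

print(len(dedupe(records)))  # -> 2
```

Detecting *inaccurate* data, by contrast, needs an external source of truth to compare against, which is exactly why it is the harder problem.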
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
After doing some research, one transaction that I found should be a focus of an SAT audit is FI12. This transaction allows users to edit the bank accounts used within the SAP system to make and receive payments. If someone is able to change this, you run the risk of fraud or theft. This risk comes right out of the movie Office Space, where a couple of employees changed a program to send the rounding differences of all the company’s transactions to their own bank account. For this reason, access to this transaction should be tightly limited, making it one of the more sensitive transactions within SAP.
Reference: http://www.saponlinetutorials.com/define-house-bank-in-sap-house-banks-overview/
Mahugnon B. Sohou says
I have to disagree with your answer to question 2. I don’t think that the person who should play this key role should depend on the type of data. As I said in my post, the person in charge should be a dedicated user, a database administrator whose sole purpose is to control the database and assure the quality of the data. If people from different departments had access to their own department’s data, those employees would not be independent, and they could still make unauthorized changes.
Rouying Tang says
Hello Ai, thank you for your posting. I agree with your third point that inaccurate data brings more risk to a company. It is thoughtful to consider both the consequences of errors and how easily they can be detected.
Haitao Huang says
The finance and accounting departments should work together to determine the master data, because those two departments are directly involved in the business processes and most familiar with the various requirements in each business process related to the ERP system. While the finance and accounting departments define the master data, the audit department should take responsibility for ensuring the confidentiality, integrity, and availability of the master data, as well as compliance with the controls related to it.
Deepa Kuppuswamy says
Hi,
Just a quick correction to your second answer. You stated that “customer master data should be defined by the sales department.” From my understanding of the Order-to-Cash process, customer master data is generally not defined by sales people, as I think that would pose a segregation-of-duties conflict. Although customer master records are used by both the Financial Accounting (FI) component and the Sales and Distribution (SD) component, in some companies master data cannot be created by sales personnel.
Imran Jordan Kharabsheh says
1. Upon the implementation of a modern Enterprise Resource Planning system, a master database needs to be created in order to centralize a majority of key customer and vendor data. This key data can then be used across all modules that use the ERP system, and can be used to reference and pull up other less-essential data from sub-databases. The primary reasoning behind the creation of a single, large master data set as opposed to multiple smaller data sets with shared information linking them is because of the need to eliminate redundancy, which can create inefficiency or mistakes in business processes if there is a discrepancy in entries that are supposed to be similar. In order to improve the integration of the master data across all parts of the ERP system, companies often seek regular revision and authentication of information present in the master data set. Companies will also enforce the segregation of duties, as the information present in the master data set is often vital and can pose a threat in the wrong hands.
2. As I mentioned towards the end of my previous response, most companies that have implemented a modern ERP system see an urgency to enforce the segregation of duties. This primarily stems from the fear that the customer and vendor information present in the master data set is very sensitive and can threaten the livelihood of a company if misused or compromised. An example of a situation you wouldn’t want is someone who creates purchase orders also having access to the vendor master data set; the fraud threat is that this person might start creating false purchases using the vendor information in the data set. So, ultimately, if I had to place anyone in charge of monitoring and reviewing the master data, I would choose a dedicated database administrator, so as to minimize the threat of someone in a shared department misusing it.
3. In order to compare the risk between inaccurate data and excessive repetitive data, it is important to note that each poses its own unique threat to a company’s data integrity, and both can have devastating effects financially and socially. The primary threat of excessive repetitive data is that redundancy can let small discrepancies (caused by one database being updated before another) have large financial impacts, and that orders may be entered incorrectly or multiple times. The primary threat of inaccurate data is that it can ruin orders and create issues with customer relations; it can also devastate a company’s finances.
4. Among the more sensitive documents and transactions that are looked at during a Sensitive Access to Transaction Audit, I find that the integrity of finance and accounting documents can pose the largest threat to a company’s sustainability. The primary reason for this is because of the sensitive information present in these documents posing a direct threat to a company’s financial well-being, and can easily be used to enact fraudulent transactions if the information fell into malicious hands. Often the best solution to this is the enforcement of segregation of duties, as well as a monitoring system being implemented.
Mahugnon B. Sohou says
I agree with your answer. Only a dedicated database administrator should play the key role of defining master data and assuring data quality. This avoids any segregation-of-duties issues and ensures that an employee from another department does not misuse the data.
Deepa Kuppuswamy says
Sorry! I had a problem posting answers, so I was trying to resolve it.
FYI: WordPress throws the following error “WordFence 403 forbidden” when you give a special character (:) and a space at the end of your sentence. Please try to avoid this.
Mahugnon B. Sohou says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Once we implement a new ERP system, we need a master database to centralize key customer and vendor data. Master data is data that is centrally stored and processed to eliminate redundancy, which can otherwise create errors in business processes if, for instance, there are two different versions of an entry that are supposed to be identical. This key data can then be used in all modules of the ERP system, assuring that the various parts of the organization are integrated and work well together. To assure integration across all parts of the ERP system, companies should also perform regular reviews of the information in the master data set.
Mahugnon B. Sohou says
2. Which department or person should play a key role in defining master data and assuring its quality?
I think the department or person who should play a key role in defining master data and assuring its quality is the database administrator. This is the best person for the task, because it assures segregation of duties so that, for instance, one person cannot both create false purchases and approve them. This minimizes the threat of someone misusing the data.
Mahugnon B. Sohou says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Inaccurate data and excessive repetitive data each pose their own unique threats to data integrity, and both can lead to devastating consequences. Inaccurate data can create errors in processes and lead to incorrect decisions being made on the basis of bad inputs. Excessive repetitive data, on the other hand, can turn small discrepancies into significant financial consequences, as multiple versions of the same order could be processed at the same time, which in the long run could also lead to issues with customer relationships. Therefore, excessive repetitive data represents the bigger risk to a company, as it has the potential to cause a bigger loss.
Mahugnon B. Sohou says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I think finance- and accounting-related transactions are the most sensitive and should have extra focus in an SAT audit. The reason is that the information present in these documents is financial information that could affect the entire organization, as it relates directly to the company’s financial well-being. In the event that this information falls into the wrong hands, it would leave the door open for fraudulent transactions.
Deepa Kuppuswamy says
1. Global enterprises have various applications and systems in which data that crosses organizational departments or divisions can easily become fragmented, duplicated, and, most commonly, out of date; such data should therefore be structured, valid, accurate, and consistent. This can be achieved by implementing a Master Data Management (MDM) framework, which helps ensure a strategic, balanced, and integrated foundation.
Master Data Management (MDM) provides a comprehensive method of enabling an enterprise to link all its critical data to one file, called a master file, that provides a common point of reference and the following key components of an MDM framework helps the teams to maximize business benefits:
MDM Governance
MDM Technology
MDM Standards
MDM Processes
MDM Data Policy
MDM Organization
Deepa Kuppuswamy says
2. When thinking about who should really be involved in defining master data, I believe that the following roles play a key function:
Data Governance: These users help administrators know what to create and data stewards know what to manage and how to manage it and also these individuals drive the definition, requirements and solution.
Administrators: Individuals in IT who are responsible for setting up and configuring the solution.
Data Stewards: These individuals are responsible for fixing, cleaning and managing the data directly within the solution. Ideally, data stewards come from departments across the business, such as finance and marketing. Typically, the activities that data stewards take on with defining master data are defined by data governance users.
Mahugnon B. Sohou says
I couldn’t agree more. In my post I mentioned the data administrator as the person who should take on the key role of defining master data and assuring data quality, however you went into deeper details and I think there couldn’t be any answer more correct than that. Great post. Thanks for sharing your thoughts.
Deepa Kuppuswamy says
3. When comparing the risk level for inaccurate and excessive data, I think the risk is comparatively higher due to inaccurate data. A better way to think about it is ‘garbage in, garbage out’, poor data input leads to poor decision making and has a direct impact on performance. Sometimes very small input errors can lead to huge output errors, so the risk associated with inaccurate data is high.
Impacts of Excessive repetitive data can range from a transaction level loss to catastrophic effect for an enterprise, like higher consumption of resources, higher maintenance costs, errors in product/mail deliveries, higher spam counts and un-subscriptions, invalid reports and many more. Avoiding these kinds of bad data completely from a source is virtually impossible. However, organizations can try to keep the data clean by implementing “Data Management” process.
4. SAP has deadly security risks that should be understood by every information security professional. For example, the SAP*, SAP_ALL, DDIC, and EARLYWATCH defaults: any transactions performed with these accounts should be monitored, and only privileged users should perform them.
If a malicious user connects to a login mechanism of your SAP system, the user may be able to use the hard-coded username (SAP*) and password (PASS) to gain SAP_ALL privileges and full control of the SAP system.
As part of an SAP security review, we should make sure that the default passwords for SAP*, DDIC, and EARLYWATCH have been changed; that SAP* exists and has been deactivated in all clients; and that these accounts have been added to the group SUPER in all clients, so that they can only be modified by administrators who are authorized to change users in the group SUPER.
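The review steps above can be sketched as a small checklist script. The account names (SAP*, DDIC, EARLYWATCH) come from the post itself, but the `export` dict is a hypothetical extract; a real review would pull account status from a system user-list report, not from hard-coded data.

```python
# Review sketch: flag SAP default accounts that are still unlocked or
# still on a well-known default password. The 'export' dict below is a
# hypothetical extract for illustration, not a real SAP API.
DEFAULT_ACCOUNTS = {"SAP*", "DDIC", "EARLYWATCH"}

export = {
    "SAP*":       {"locked": False, "default_pw": True},   # two findings
    "DDIC":       {"locked": True,  "default_pw": False},  # clean
    "EARLYWATCH": {"locked": False, "default_pw": False},  # unlocked
}

def findings(accounts):
    """Return one audit finding per failed check, in account order."""
    out = []
    for name in sorted(DEFAULT_ACCOUNTS):
        info = accounts.get(name, {})
        if info.get("default_pw"):
            out.append(f"{name}: default password unchanged")
        if not info.get("locked"):
            out.append(f"{name}: account not locked")
    return out

for f in findings(export):
    print(f)
```

Each finding would then feed the review workpapers, with remediation (password change, account lock) tracked per client.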
Penghui Ai says
I agree with your opinion that inaccurate data is more of a risk to a company than excessive repetitive data. In addition to your point, if a decision maker uses inaccurate data, the resulting decision could be incorrect, leading to financial loss and depletion of resources.
Rouying Tang says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data in an ERP system is a comprehensive method to collect, store, and process all data throughout the organization. To assure this integration works well, the most critical thing is the use of uniform language and measurement. We need to make sure that users from different departments use one standard, to ensure consistent presentation of the data and to avoid misunderstandings that lead to ineffective or inaccurate decisions.
2. Which department or person should play the key role in defining master data and assuring its quality?
The chief information officer or chief data officer should define the master data; however, the managers from the different departments who own and process those data should assure its quality.
Deepa Kuppuswamy says
Hi Rouying,
I don’t think the CIO would be accessing or updating the master data. When I reviewed the responsibilities of a CIO, it stated that the CIO is in charge of information technology (IT) strategy and the computer systems required to support the organization’s unique objectives and goals. The CIO operates at a much higher level and is involved in strategic thinking about the business.
Rouying Tang says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Both could have a negative impact on a company; however, inaccurate data is riskier. Excessive repetitive data occupies storage volume and decreases execution speed, but inaccurate data can lead to wrong strategies and decisions, which has a worse impact in the long run.
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Personally identifiable data should always attract extra focus in an SAT audit. Why do we collect those data? Are they necessary? Where are they stored, and are they safe? Who can access them? How will they be processed, and what is the benefit of processing them? All of the above need to be considered, because a small error in any of these processes can bring real-life trouble to a living person, and sometimes the consequences are unaffordable. For a company, it is not just a measurable financial loss; it can become a reputation issue.
Yuan Liu says
1. The principle objective of any ERP is to integrate all the functions of a business into a unified platform. Many businesses are still stuck in the unenviable position where their data is stored in multiple locations and systems. By centralizing this data and streamlining the means of accessing (as well as adding to) data, ERP contributes to greater efficiency within a business model. ERP software integrates various processes that are essential to run a business enterprise into one single database. These processes include inventory and order management, accounting, human resources, customer relationship management (CRM), among others. By streamlining all the processes into one effective system, ERP provides your business with a shared database that supports multiple functions across your enterprise.
2. I think information and data management (IDM) should play the key role in defining master data and assuring its quality, because it forms the policies, procedures, and best practices that ensure data is understandable, trusted, visible, accessible, optimized for use, and interoperable. IDM includes processes for strategy, planning, modeling, security, access control, visualization, data analytics, and quality. Outcomes encompass improving data quality and assurance, enabling information sharing, and fostering data reuse by minimizing data redundancy.
3. I think excessive repetitive data is more harmful to a company’s development, because there are ten reasons why duplicate data harms a business: wasted costs and lost income, lack of a single customer view, negative impact on brand reputation, poor customer service, inefficiency and lack of productivity, decreased user adoption, inaccurate reporting and less informed decisions, missed sales opportunities, poor business processes, and poor targeting with wasted marketing effort.
4. I think material master transactions should get extra attention in an SAT audit, because material is a basic building block for many companies and organizations: they need materials to produce goods and deliver services. For example, there are two important transactions: Change Material and Create Material. Information systems staff have to enter every detail into the system, and mistakes can happen during entry; the system will then automatically carry that information into each downstream section. Companies should therefore pay extra attention to these transactions.
Yuan Liu says
Top 10 SAP t-codes by line of business processed by Winshuttle Studio
Master Data
T-Code T-Code Description Number of Customers Processing time saved by automation
MM02 Change Material 751 86.52%
MM01 Create Material 630 88.20%
XD02 Customer Change 440 79.80%
VK11 Create Condition Records 408 78.60%
XK02 Vendor Change 392 86.70%
XD01 Customer Create 341 93.60%
XK01 Vendor Create 287 89.90%
VK12 Change Condition Records 242 77.80%
KS02 Change Cost Center 178 74.60%
KS01 Create Cost Center 177 94.00%
Reference: https://www.winshuttle.com/how-we-help/top-10-sap-processes/
Haitao Huang says
1. Question 1
It is very important to protect the integrity and availability of the master data since the data is highly integrated with various processes.
To protect the integrity of the master data, an administrator should set up various check parameters to reduce errors and malicious activity in the input process. For example, the administrator may assign a user-specific tolerance and a business-partner-specific tolerance to each type of user. A tolerance group defines various rules for when a user posts data into the system, including the maximum amount for which an employee may post a document, the maximum amount for which an employee may enter a line item in a customer or vendor account, the cash discount (in percent) an employee may grant in a line item, and the maximum amount for which payment differences may be accepted. If a user needs to post a value beyond the predefined parameters, an appropriate approval is required.
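A minimal sketch of such a tolerance-group check follows. The group names and limits here are hypothetical illustrations; in SAP these limits live in configuration, not application code.

```python
# Tolerance-group sketch: block postings that exceed the limits of the
# user's group. Group names and amounts below are hypothetical.
TOLERANCE_GROUPS = {
    "CLERK":      {"max_document": 10_000,  "max_cash_discount_pct": 2.0},
    "SUPERVISOR": {"max_document": 100_000, "max_cash_discount_pct": 5.0},
}

def check_posting(group, amount, discount_pct):
    """Return 'posted' or a blocked message describing the violation."""
    limits = TOLERANCE_GROUPS[group]
    if amount > limits["max_document"]:
        return "blocked: amount exceeds tolerance, approval required"
    if discount_pct > limits["max_cash_discount_pct"]:
        return "blocked: cash discount exceeds tolerance"
    return "posted"

print(check_posting("CLERK", 9_500, 1.5))   # -> posted
print(check_posting("CLERK", 25_000, 1.5))  # -> blocked: amount exceeds tolerance, approval required
```

The "blocked" branch is where the approval workflow described above would be triggered instead of rejecting the posting outright.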
2. Question 2
The finance and accounting departments should work together to determine the master data, because those two departments are directly involved in the business processes and most familiar with the various requirements in each business process related to the ERP system. While the finance and accounting departments define the master data, the audit department should take responsibility for ensuring the confidentiality, integrity, and availability of the master data, as well as compliance with the controls related to it.
3. Question 3
Inaccurate data will cause more damage to an organization than excessive repetitive data. Inaccurate data may have various negative impacts on business operations or processes: for example, inaccurate inventory data might result in ordering insufficient or excessive inventory in the future, inaccurate invoice data may cause failures in recognizing and reporting revenue, and inaccurate customer data might cause the organization to deliver products or services to the wrong customers. Excessive repetitive data might increase storage costs, but inaccurate data leads to more significant impacts on the organization.
4. Question 4
In SAP-supported financial accounting, there are transactions that require special protection due to the associated high risks, including creating users, security audit configuration, role maintenance, or system trace. Some transaction codes are very critical and should not be assigned to anyone in the production system and should be locked. Also, there are some transaction codes which should only be assigned to Basis or Security team or to some superuser roles.
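A sketch of how an SAT audit might surface who holds such critical transaction codes. The sensitive list follows the examples in the post (user maintenance, role maintenance, system trace); the role names and user assignments below are hypothetical.

```python
# SAT-audit sketch: report which users hold sensitive transaction codes
# through their roles. Roles and assignments are hypothetical examples.
SENSITIVE_TCODES = {"SU01", "PFCG", "ST01"}  # user maint., roles, trace

role_tcodes = {
    "Z_BASIS_ADMIN": {"SU01", "PFCG", "ST01"},
    "Z_AP_CLERK":    {"FB60", "MIRO"},
    "Z_SUPPORT":     {"ST01", "SE16"},
}
user_roles = {
    "dave":  {"Z_BASIS_ADMIN"},
    "erin":  {"Z_AP_CLERK"},
    "frank": {"Z_SUPPORT"},
}

def sensitive_access(users, roles):
    """Map each user to the sorted sensitive t-codes they can reach."""
    report = {}
    for user, assigned in users.items():
        hits = set()
        for role in assigned:
            hits |= roles.get(role, set()) & SENSITIVE_TCODES
        if hits:
            report[user] = sorted(hits)
    return report

print(sensitive_access(user_roles, role_tcodes))
```

The auditor would then confirm each listed user genuinely needs that access in production, and that the remaining critical codes are locked.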
Deepa Kuppuswamy says
Hi Haitao,
Good answer for Question 4. I agree with your point. When I was reading through the SAP privileged transactions, I happened to read about the critical privileges assigned to the SAP default accounts, which are SAP*, SAP_ALL, EARLYWATCH, and DDIC. These are highly privileged accounts; if many users are given access to them, it becomes very difficult to maintain accountability in the event of unauthorized transactions. As a best practice, it is also recommended to lock these accounts when not in use, with an authorization and approval step required to unlock them.