- Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
- Which department or person should play the key role in defining master data and assuring its quality?
- Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
- Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Robert Conard says
1. Centralizing the master data so it is available to all would be a good start to ensuring its best use. How the various departments derive data will determine how effectively they use it. Availability is key here so that every business facet can fully implement the data it needs, within the bounds of its allowed access.
Robert Conard says
2. I think first and foremost the data owner decides the availability of its information. Information/network security personnel should maintain the integrity of that data with controls around its usage. The business (management) decides what data is useful within a master set.
Robert Conard says
3. Inaccurate data. Excessive information that is not recognized as repetitive can be just as destructive as inaccurate information: a sale logged twice will yield inaccurate reported revenues. Data integrity is by far the most important thing. There are external consequences of inaccurate data, but a company making decisions on wrong data will make wrong choices.
Heiang Cheung says
Hey Robert,
I thought the same thing: inaccurate data and repetitive data are kind of the same, because if you have two of the same entries then the data is inaccurate. I would go with inaccurate data as the bigger risk, because repetitive data can be easier to spot when you’re doing reconciliations.
Scott Radaszkiewicz says
I kind of thought the same way originally. But I took the view that excessive or repetitive data is something you can identify and eliminate from your decision making. If data is duplicated, sometimes you can account for that. Incorrect data is much harder to find.
Heiang Cheung says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data is the core data that is essential to operations in a specific business or business unit.
An organization needs to make sure the data is accurate to assure the integration works well for all, because if it is not accurate then it is pointless. Also, making sure people have the right access to the data is critical, because if people have access they are not supposed to have, there is a risk of fraud.
Heiang Cheung says
2. Which department or person should play the key role in defining master data and assuring its quality?
I think the accounting department should play a key role in defining master data and assuring its quality because everything runs through the accounting department one way or another. Also, it’s accountants that do all the reconciliations for GL accounts and bank statements so assuring quality is part of the job of an accountant.
Mahugnon B. Sohou says
I have to disagree with your answer to question 2. I don’t think that the person who should play this key role should be from the accounting department. As I said in my post, the person in charge should be a dedicated user, a database administrator whose sole purpose is to control the database and assure the quality of the data. If people from different departments had access to the data from their department, those employees would not be working independently, and they could still make unauthorized changes.
Heiang Cheung says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I actually think they are basically the same in a way, because repetitive data is basically inaccurate data. That said, I believe inaccurate data is the bigger risk to a company, because the business uses the data to run its operations and makes decisions based on it. If you have inaccurate data, you could have inaccurate financials reported to the public, which would be a big risk because you would have to deal with the SEC and your company’s stock price would definitely take a dive. With repetitive data, you can usually discover that the data is repetitive and fix it when you’re doing reconciliations.
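To illustrate how repetitive data can surface during a reconciliation, here is a minimal sketch (the invoice IDs and the shape of the ledger extract are hypothetical, not from any real system):

```python
from collections import Counter

def find_duplicates(invoice_ids):
    """Return invoice IDs that appear more than once in a ledger extract."""
    counts = Counter(invoice_ids)
    return sorted(inv for inv, n in counts.items() if n > 1)

# A sale logged twice shows up immediately in a duplicate check.
ledger = ["INV-001", "INV-002", "INV-002", "INV-003"]
print(find_duplicates(ledger))  # → ['INV-002']
```

Inaccurate data offers no such easy check: nothing in a single wrong entry flags itself.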
Heiang Cheung says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I think I would have to say any access to transactions that can modify GL accounts would be the most sensitive, because if they are not controlled strictly there is a risk of duplicate GL accounts, which could cause misallocations and potential fraud. The link below gives some of the transaction codes in SAP that are sensitive.
https://www.winshuttle.com/blog/sensitive-transaction-codes-sap-year-end-audit/
Xiaozhou Yu says
I agree that GL accounts are sensitive within the SAP system, since they contain all business transactions and provide a clear view of external accounting. They can be used as a tool to analyze all accounts and transactions.
Scott Radaszkiewicz says
Question 2: Which department or person should play the key role in defining master data and assuring its quality?
I thought about this for a bit. I think the answer is some C-level person, such as the CFO or CIO, most likely the CIO. But this should be a collaboration to ensure that all facets of the organization are represented. Master data is key, and there must be one person in charge to oversee it all. A data steward is a common position in large organizations. This person is key in ensuring that master data meets the organization’s needs and that procedures and policies are in place to ensure its quality and integrity.
Mengqiao Liu says
Good point on data steward. After researching, I realized that a data steward serves as the data governance subject matter expert to the business unit they represent. They are trained and enabled to lead the execution of data quality initiatives, including remediation where it is needed.
Scott Radaszkiewicz says
Question 3: Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I believe that inaccurate data is the greater risk to a company. Excessive data, while cumbersome, can be evaluated, the redundancy sifted out, and the true data analysed. But if the data is incorrect, there is no accounting for that error. You will make business decisions based on data that is not valid, and that could be a very dangerous thing.
Scott Radaszkiewicz says
Question 4: Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
So, this one took some research. In a blanket statement, I believe any transactions that are financial transactions are the most sensitive. But in my research there are two SAP Transactions that I think are extremely important to audit. FS01 Create Master Record and FS02 Change Master Record. These two transactions could have a wide sweeping effect on the system and permit some serious fraud to take place. Creating or modifying Master Records without authorization is a process that we would definitely want to audit and ensure that nothing irregular is going on.
Akiyah Baugh says
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data in an ERP system should have controls in place, like segregation of duties, to ensure that employees cannot commit fraud. Business process rules should also be in place to ensure that master data, customer information, and transactions, to name a few, are accurate.
An organization can run reports on system access, transactions, data as well as complete regular maintenance to ensure that the ERP system’s controls are accurate and efficient.
Scott Radaszkiewicz says
I agree Akiyah, no matter who is in charge of the Master Data, proper procedures and controls are essential to the integrity of the data. Seems like we always go back to Segregation of Duties and proper controls as the solution to safeguard any company asset.
Akiyah Baugh says
Which department or person should play the key role in defining master data and assuring its quality?
I believe the accounting/finance departments should work with the business departments to define and maintain the master data. The accounting department should be responsible for entering and maintaining the master data, however they need to closely monitor the data from other departments such as purchasing, parts, sales, etc…
Akiyah Baugh says
Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I believe inaccurate data is more of a risk to a company. Controls such as unique constraints can be put in place to prevent duplicate entries. A control can be set up for inaccurate data as well, but in my opinion it would be more difficult to implement than duplicate-entry controls.
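As a minimal sketch of a duplicate-entry control (assuming a SQLite table purely for illustration), a unique constraint rejects the repeat at the point of entry:

```python
import sqlite3

# In-memory table with a primary-key (unique) constraint on the vendor ID.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vendor (vendor_id TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO vendor VALUES ('V100', 'Acme Corp')")

try:
    # Entering the same vendor a second time fails outright.
    conn.execute("INSERT INTO vendor VALUES ('V100', 'Acme Corp')")
except sqlite3.IntegrityError:
    print("duplicate rejected")
```

No comparable one-line constraint exists for “the amount is simply wrong,” which is why inaccurate data is harder to control.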
Akiyah Baugh says
Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I believe the most “sensitive” transactions are any transactions that involve money coming into or leaving the company. Extra focus should be given to these areas to ensure that all monies reach their proper destination internally, as well as ensuring that no monies are fraudulently leaving the company. Controls should be put in place to safeguard sensitive data.
Pascal Allison says
As cash is difficult to recover, it is not a good exposure for a company to have cash leaving or entering with little or no control. If cash leaves the company and the company tries to recover it, good money might go chasing bad money: the lost cash will not be recovered, thereby increasing the loss.
Notwithstanding, there are other transactions that affect decision making concerning customers, investment, marketing, etc. I believe they could equally affect the company, and therefore they should be treated with some sensitivity.
Folake Stella Alabede says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
An organization can assure that the integration of master data in an ERP system with various processes works well by attending to details like data quality. Data quality ensures that data coming from different and multiple sources is consistent, error-free, and non-redundant; this can be achieved by applying standard rules and business processes to build a single view of the data across the many parts of the organization before distribution.
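A minimal sketch of applying standard rules to build a single view (the record fields and rules here are hypothetical):

```python
def standardize(record):
    """Apply simple standard formatting rules so that records
    arriving from different sources resolve to one view."""
    return {
        "vendor": record["vendor"].strip().upper(),
        "country": record["country"].strip().upper()[:2],
    }

# Two departments describe the same vendor differently...
purchasing = {"vendor": " acme corp ", "country": "US"}
sales = {"vendor": "ACME CORP", "country": "us"}

# ...but after standardization they are recognizably one entity.
print(standardize(purchasing) == standardize(sales))  # → True
```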
Folake Stella Alabede says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I would think that inaccurate data poses more of a risk than excessive repetitive data, AS LONG AS the excessive data is accurate and not repetitive. But if the excessive data is inaccurate as well, then it’s a major problem, as you have many copies or occurrences of this inaccurate data everywhere.
Inaccurate data translates to garbage in, garbage out: since the input is wrong, the output would definitely be wrong as well, and this translates to high risk.
If you have excessive “and accurate” data, it’s low/medium risk; it’s just data taking up space, probably being bothersome as it might be popping up everywhere (excessive).
But when data is repetitive (for example, making multiple vendor payments), it can be dangerous to an organization.
This is one of the advantages of master data: it is centrally stored and processed to eliminate data redundancy.
Folake Stella Alabede says
2. Which department or person should play the key role in defining master data and assuring its quality?
This reminds me of one of the questions from last week about the key competencies the person responsible in a company for security / a given process needs to have.
Whoever plays the key role in defining master data and assuring its quality should be very knowledgeable about the processes and functions of the organization. That being said, I don’t believe this should be a one-man job; it should be people with expert knowledge of the different applicable business processes (different departments) coming together to decide on the important and relevant data/information needed to create/define the master data.
To assure the quality of the master data, there should at least be a data manager (usually someone from the IT department) to implement (already defined master data) and subsequently manage the technical aspect and ensure access protection.
Scott Radaszkiewicz says
Nice way to tie the weeks together, Folake. I agree, the person should possess those traits, and they should understand the entire organization and the organization’s mission. This, to me, should be some high-level C-suite people. While they might not be doing all of the work, they should be the ones in charge and held accountable for this task.
Mengqiao Liu says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
The organization needs master data management (MDM) to assure this integration works well for all. Master data management is a technology-enabled business discipline in which business and IT organizations work together to ensure uniformity, accuracy, stewardship, semantic consistency, and accountability of an organization’s official, shared master data assets for enterprise resource planning (ERP) projects.
Mengqiao Liu says
2. Which department or person should play the key role in defining master data and assuring its quality?
MDM is an important business initiative because it works to eliminate redundant and inconsistent versions of the same data across the organization and across data domain types. Master data domain types can vary across projects; the typical scope of master data includes customer data, supplier-related data, employee data, warehouse data, the chart of accounts, etc. The person should be an IT-related expert who has rich experience and is deeply familiar with the data domains above.
Mahugnon B. Sohou says
I agree with your answer. I think only a dedicated database administrator should play the key role of defining master data and assuring data quality. This avoids any segregation-of-duties issues and ensures that an employee from another department does not misuse the data.
Mengqiao Liu says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
In my opinion, inaccurate data is more of a risk. Excessive repetitive data can be corrected by normalization, which restructures a database to minimize redundancy. For inaccurate data, the risk is more easily prevented than detected or corrected, through input validation. However, it is hard to detect or correct if there was human error or fraud.
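A minimal sketch of the input-validation idea (the field names and rules are hypothetical):

```python
import datetime

def validate_entry(entry):
    """Reject obviously inaccurate input before it reaches the master data."""
    errors = []
    if entry.get("amount", 0) <= 0:
        errors.append("amount must be positive")
    try:
        datetime.date.fromisoformat(entry.get("posting_date", ""))
    except ValueError:
        errors.append("posting_date must be a valid YYYY-MM-DD date")
    return errors

# Month 13 and a negative amount are both caught at entry time.
print(validate_entry({"amount": -50, "posting_date": "2019-13-01"}))
```

Validation only catches values that are impossible, though; a plausible-but-wrong amount passes, which is why human error or fraud stays hard to detect.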
Mengqiao Liu says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
A sensitive or critical access risk is one where the direct assignment of an ability to a backend user constitutes a risk. The objective here is to help establish that access is restricted to the appropriate individuals. Maintenance of accounting periods should be segregated from the posting of financial transactions, to prevent posting in the wrong period. The receipt/maintenance of inventory should be segregated from order and invoicing activities. Reconciling and releasing blocked vendor invoices should be segregated from daily processing and posting activities. Maintenance of contracts and terms should be segregated from payment and billing document changes.
Reference:
https://chapters.theiia.org/los-angeles/Events/Documents/IIA%20%20Los%20Angeles%20%20SAP%20Security%20Presentation%20.pdf
Xiaozhou Yu says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data in an ERP system is designed as a solution to leverage and manage all the data being generated, to improve business process outcomes. ERP systems demand a common definition of critical enterprise data so that common business processes can be implemented.
It is important to ensure the data included is well defined and organized and has effective connections with the ERP modules to support business operations. Also, access to the data should be under proper control to maintain information confidentiality, integrity, and availability.
Xiaozhou Yu says
2. Which department or person should play the key role in defining master data and assuring its quality?
There are a couple of roles that should be involved: data governance, administrators, and data stewards.
Data governance users dictate to data stewards how data should be managed, including the processes for doing so, and then hold the data stewards accountable for following those requirements. Data governance users also dictate to administrators what to create during the implementation of the master data and ERP integration, especially from a data-matching and quality perspective. Data stewards are the boots-on-the-ground individuals responsible for fixing, cleaning, and managing the data directly within the solution.
Mahugnon B. Sohou says
I couldn’t agree more. In my post I mentioned the data administrator as the person who should take on the key role of defining master data and assuring data quality, however you went into deeper details and I think there couldn’t be any answer more correct than that. Great post. Thanks for sharing your thoughts.
Xiaozhou Yu says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I think inaccurate data is riskier to an organization. It is true that duplicate data has negative impacts on a business in terms of management cost, inefficiency, and poor productivity, but an organization can certainly apply data cleansing and remove all excessive repetitive data from the system to keep data management clear and concise.
Inaccurate data, on the other hand, has a direct impact on business operations and processes and will not be noticed until the results go wrong. Then we need to track back to detect the problems and fix the errors. This increases management costs as well, and the time spent fixing those issues slows down overall business efficiency.
Xiaozhou Yu says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Within SAP, I think general ledger accounts are the most sensitive. General Ledger (G/L) accounts are used to provide a picture of external accounting and to record all business transactions. And in SAP, access to manage these accounts is restricted through authorizations.
Folake Stella Alabede says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Most transactions that deal with cash are sensitive in one way or another. In addition, transactions that have to do with master data are sensitive as well (e.g., users with access to perform customer master data changes). Access to SAP functions that enable users to create, modify, or delete General Ledger accounts should be restricted, and access should only be granted (and logged if possible) for specific business transactions as needed.
James T. Foggie says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
(1) The master data must be accurate, relatively stable, and reusable.
(2) The master data should be centrally stored (shared across application modules).
(3) The design must eliminate data redundancy.
(4) The master data structure must be designed in a way that provides quick, efficient support for transactions and events in the ERP system.
Nauman Shah says
Very elaborate response, James. I agree with all 4 strategies you mentioned; however, I would like to add an additional layer of protection, which is automated or manual controls over the completeness and accuracy of the data as it flows from source to destination systems. Users of the data need assurance that the data they are getting from other systems/modules is complete and accurate, and hence reliable.
James T. Foggie says
2. Which department or person should play the key role in defining master data and assuring its quality?
A person with ample authority to bring all stakeholders together to efficiently define the master data.
The master data should be maintained in an independent manner. Master data fields should also be evaluated on regular intervals to ensure the relevance of fields and data in the master data repository.
James T. Foggie says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Both are bad, but if I have to select one, I say inaccurate data is more of a risk. I say this because excessive repetitive data, although it will lead to performance challenges, is not as risky as inaccurate data so long as it is accurate. Inaccurate data leads to poor decisions at all levels of an organization.
Pascal Allison says
I agree with the level of risk of inaccurate data compared to excessive data. Inaccurate data could lead to misrepresentation, which could lead to a lot of issues (legal, regulatory, investment, etc.).
James T. Foggie says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an
SAT (Sensitive Access to Transaction) audit? Explain
Restrict access to SAP functions that modify GL Accounts. Access to SAP functions that enable users to create, modify or delete GL accounts should generally be restricted, and access only granted for very specific business needs.
Examples of these type of t-codes include:
◾FS01 Create Master Record
◾FS02 Change Master Record
◾FS05 Block Master Record
◾FS06 Mark Master Record for Deletion
◾FSS1 Create Master Record in Company
https://www.winshuttle.com/blog/sensitive-transaction-codes-sap-year-end-audit/
Tamekia P. says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
The organization assures that this integration works by customizing the master data to ensure that it is applicable to the needs of the organization. When setting up the ERP, the organization would discuss what fields were necessary and prioritize their use so that the master data would be configured to gain the most benefit for the organization. The major stakeholders within the organization would come together to determine the best approach. Depending on the fields requested, other modules within the ERP may need to append additional fields to the master data, but the master data would be a central point of consistency. For example, the chart of accounts would need to be defined so that there was one listing of accounts that could be used, and people could not create their own accounts.
Tamekia P. says
2. Which department or person should play the key role in defining master data and assuring its quality?
The Finance department should play the key role in defining the master data, especially if the end goal of the ERP system is to produce financials for the organization. Finance would need to approve changes to the master data and ensure the groupings of accounts are appropriate so they can be reconciled appropriately.
Tamekia P. says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Inaccurate data is a bigger risk to a company than excessive repetitive data. If you know the data that is repetitive, you may have to spend time cleaning up the data to remove the repeats. However, inaccurate data requires more time because you would have to determine which data is inaccurate vs accurate.
Derrick A. Gyamfi says
Tamekia,
I agree that inaccurate data is a bigger risk to an organization. Problems with data quality are practically inevitable. To err is human, and nearly everyone makes the occasional mistake entering customer information into databases. It’s essential that companies take action to protect their data – if not, the consequences can be staggering.
Tamekia P. says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Transactions concerning cash should be the most sensitive as it is typically hard to recover cash when it has left the organization.
Pascal Allison says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
The integration of data is vital to an organization; as such, it should be managed properly to ensure the integration of master data is accurate and done properly. The data must be processed from its origin with the involvement of the responsible department or primary users. This enables proper data availability; then comes the integrity of the data. The information security officer or system administrator (security) should ensure the integrity of the data, that is, that the data is available to the proper users and only when it is needed. If need be, a senior management individual can sit over the master data management to ensure the requisite processes and procedures are instituted. For the most part, the data must be centralized for ease of monitoring and control, taking into account segregation of duties.
2. Which department or person should play the key role in defining master data and assuring its quality?
Because of control and security, the definition of master data should begin with the owner of the data, because each user (department) defines data differently. The individual directly involved in using the data and quality control should begin the definition of the master data. Assurance of the data can be done by a higher-level individual (management) in conjunction with those whom the data affects, or the data owner.
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
All data carries risk for a company, but if I must underline one, I will underline inaccurate data as a higher risk compared to excessive data. Inaccurate data could lead to lots of problems for a company. One major area is decision making, which could affect the customer database, reputation, revenue, etc. Management could divert supplies and investment based on a decision made from inaccurate data, which could affect investment and returns.
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an
SAT (Sensitive Access to Transaction) audit? Explain
All transactions are sensitive and should have all the necessary controls and focus in an SAT. Transactions carry data, and data may be used for decision making. Besides, transactions affect the financial statements; if they are not a point of focus or controlled, transactions could be incorrect, duplicated, manipulated, etc., which could lead to a violation of a legal obligation or regulatory law, poor decision making, etc.
Derrick A. Gyamfi says
An ERP system’s primary purpose is to automate business processes in order to make the business more efficient and provide better visibility into those processes. But in order to get the benefits of ERP systems, businesses need to integrate their ERP system with their other enterprise systems. This can often be difficult; the challenges of ERP integration involve the age of the systems, the architecture of those systems, and the need to integrate new applications and systems into the original ERP. An organization can ensure this integration works well by utilizing incremental adoption of all data.
Derrick A. Gyamfi says
Master data is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise, including customers, prospects, citizens, suppliers, sites, hierarchies, and the chart of accounts. I think the IT and Business Development departments should play a key role in defining master data and assuring its quality.
Derrick A. Gyamfi says
I think inaccurate data is more of a risk to a company than excessive repetitive data. This is because, although excessive repetitive data lags or slows down business processes, inaccurate data can be extremely harmful to an organization. This includes damage to an organization’s bottom line, resulting in customer churn, a damaged reputation, or revenue loss.
Derrick A. Gyamfi says
In my opinion, F110 is the most ‘Sensitive’ transaction and should have extra focus in an SAT audit. F110 is a T-code that can be executed by users based on their SAP authorizations; this code is known as “Payment Run/Automatic Payment Transactions”. F110 is used for processing payments and printing checks automatically. This code is also in the financial accounting functional area and can be subject to high-risk fraud and errors.
Nathan A. Van Cleave says
1. Master data in an ERP system is highly integrated with various processes and effects many parts of the organization. How does an organization assure this integration works well for all?
Master data should be treated as a “single version of truth” for all systems. If the master data itself becomes corrupted or there are integrity issues, then the systems that rely on that data will experience integrity and integration issues.
To help ensure the master data remains true, there should be clearly defined and implemented controls around who has access to the data and who has the ability to move, change, or update the data.
Nathan A. Van Cleave says
2. Which department or person should play the key role in defining master data and assuring its quality?
The IT department should have an individual or group to ensure the master data’s quality. This group should understand the structure of the data, the source and downstream systems that consume it, and the integration needed for various systems to feed and consume the data. They will also need to understand the cycles in which the data is updated or moved, and there must be clearly defined roles, responsibilities, and controls in place around those updates and movements.
Nathan A. Van Cleave says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
It depends on the company’s risk appetite and tolerance. I would lean toward inaccurate data being more of a risk, as accurate data is imperative for financial and other systems to do what they are supposed to do: provide reliable information to help make or influence business decisions.
Excessive, repetitive data can also present risk, as it can cause burdensome system processing or require significant manual intervention to cleanse the data.
As mentioned, the company’s resources and capabilities, and what the data is being used for, determine the risk level at which the company would view it.
Mahugnon B. Sohou says
Sorry I accidentally shared my posts in the 401 section earlier instead of the 701 online section
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
When we implement a new ERP system, we need a master database to centralize key customer and vendor data. Master data is data that is centrally stored and processed to eliminate redundancy, which can create errors in business processes if, for instance, there are two different versions of an entry that are supposed to be identical. These key data can then be used in all modules of the ERP system, ensuring that the various parts of the organization are integrated and work well together. To assure integration across all parts of the ERP system, companies should also regularly review the information present in the master data set.
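The "two different versions of an entry" problem mentioned above can be caught with a basic deduplication pass before records enter the master set. The sketch below is illustrative only: the normalization rules and vendor names are assumptions, and real master-data tools use far more sophisticated matching.

```python
# Illustrative sketch: detect near-duplicate vendor entries before they
# enter the master data set. Normalization rules and sample data are assumed.
def normalize(name: str) -> str:
    """Crude normalization: lowercase, drop punctuation and corporate suffixes."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    tokens = [t for t in cleaned.split() if t not in {"inc", "llc", "co", "corp"}]
    return " ".join(tokens)

vendors = ["Acme Co.", "ACME Inc", "Globex LLC", "Initech"]

seen = {}
duplicates = []
for v in vendors:
    key = normalize(v)
    if key in seen:
        duplicates.append((v, seen[key]))  # (new entry, existing entry)
    else:
        seen[key] = v

print(duplicates)  # "ACME Inc" collides with "Acme Co."
```

Even this crude key-based match shows why centralizing the data matters: the duplicate is only detectable because both entries flow through one chokepoint.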
Mahugnon B. Sohou says
2. Which department or person should play a key role in defining master data and assuring its quality?
I think the department or person who should play a key role in defining master data and assuring its quality is the database administrator. This is the best person for the task to assure that there is segregation of duties, so that, for instance, one person cannot both create false purchases and approve them. This minimizes the threat of someone misusing the data.
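The segregation-of-duties idea above (no one both creates and approves a purchase) can be expressed as a simple conflict-pair check over user permissions. This is a minimal sketch under assumed permission names and made-up users, not a real ERP authorization model.

```python
# Illustrative segregation-of-duties check: flag users whose permissions
# span both sides of a conflict pair. Permission names and users are assumed.
CONFLICT_PAIRS = [({"create_purchase"}, {"approve_purchase"})]

user_perms = {
    "alice": {"create_purchase"},
    "bob": {"approve_purchase"},
    "mallory": {"create_purchase", "approve_purchase"},  # SoD conflict
}

def sod_violations(perms, conflicts):
    """Return users holding permissions on both sides of any conflict pair."""
    return [user for user, held in perms.items()
            for left, right in conflicts
            if held & left and held & right]

print(sod_violations(user_perms, CONFLICT_PAIRS))  # → ['mallory']
```

Running a check like this periodically turns "segregation of duties" from a design intention into a testable control.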
Mahugnon B. Sohou says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Inaccurate data and excessive repetitive data both pose unique threats to data integrity and can both lead to devastating consequences. Inaccurate data can create errors in processes and lead to incorrect decisions being made from it. Excessive repetitive data, however, can turn small discrepancies into significant financial consequences, as multiple versions of the same order could be processed at the same time, which in the long run could also damage customer relationships. Therefore, excessive repetitive data represents the bigger risk to a company, as it has the potential to cause a bigger loss.
Mahugnon B. Sohou says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I think finance and accounting transactions are the most sensitive and should have extra focus in an SAT audit. The information in these documents is financial and could affect the entire organization, as it directly relates to the company’s financial well-being. If this information falls into the wrong hands, it would leave the door open for fraudulent transactions.
Nauman Shah says
1 – Master data in an ERP should be centralized, and access should be provided to those departments that need it for their business processes/accounting. Strong access controls need to be in place for master data to ensure its confidentiality, integrity, and availability. In the case of an automated feed of master data from a source to a destination system, controls need to be in place to ensure completeness and accuracy of the data.
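A common way to implement the completeness/accuracy control mentioned above is to reconcile record counts and hash totals between the source and destination of the feed. The sketch below is a simplified assumption-laden illustration (the fingerprint scheme and sample records are made up), not any vendor's actual interface control.

```python
# Minimal sketch of a completeness/accuracy control for an automated
# master-data feed: compare record counts and an order-independent hash
# total between source and destination. Data and scheme are illustrative.
import hashlib

def feed_fingerprint(records):
    """Record count plus an XOR-combined hash total over the records."""
    digest = 0
    for rec in records:
        h = hashlib.sha256(repr(sorted(rec.items())).encode()).hexdigest()
        digest ^= int(h, 16)
    return len(records), digest

source = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
destination = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]

if feed_fingerprint(source) == feed_fingerprint(destination):
    print("feed reconciled: counts and hash totals match")
else:
    print("ALERT: feed mismatch between source and destination")
```

Matching counts catch dropped or duplicated records (completeness); matching hash totals catch records that arrived but were altered in transit (accuracy).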
Nauman Shah says
2 – The business process owners should be responsible for defining the master data and assuring its quality. They are on the front line of the business and have detailed knowledge of the master data and its elements. For instance, the procurement department that deals with vendors would have knowledge of the vendor master data and material master data, so they would be able to define it in the system. As far as ensuring quality goes, database administrators or the SAP Basis team would be responsible for safeguarding the data from unauthorized access and changes.
Nauman Shah says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Inaccurate data, by far, poses the greater risk to a company, as it can cause inaccurate reporting/forecasting which might mislead the stakeholders. Inaccurate data would basically render any business process that depends on it inaccurate. Excessive/repetitive data only causes inefficiencies and redundancy, which is not good from a process and storage optimization standpoint.
Nauman Shah says
4 – Any tcode that provides edit access to the database tables would be the most sensitive and therefore the most important to protect. With unauthorized access to the database tables, the integrity of the data would be at risk.
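An SAT audit of the risk described above often boils down to cross-referencing each user's executable tcodes against a sensitive-tcode list. The sketch below is hedged: the tcodes shown (SE16 data browser, SM30 table maintenance, SE11 data dictionary) are commonly cited sensitive SAP transactions, but the user access data and the flat dictionary format are purely illustrative.

```python
# Hedged sketch of a Sensitive Access to Transaction (SAT) check: flag
# users whose assigned tcodes include table display/maintenance
# transactions. User data and extract format are hypothetical.
SENSITIVE_TCODES = {"SE16", "SM30", "SE11"}

user_access = {
    "jdoe": {"VA01", "VA02"},          # sales order entry only
    "basis_admin": {"SM30", "SU01"},   # table maintenance + user admin
}

def sat_findings(access, sensitive):
    """Map each user to the sensitive tcodes they can execute."""
    return {user: tcodes & sensitive
            for user, tcodes in access.items()
            if tcodes & sensitive}

print(sat_findings(user_access, SENSITIVE_TCODES))
```

Each finding would then be reviewed against the user's job role to decide whether the access is justified or should be removed.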