- Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
- Which department or person should play the key role in defining master data and assuring its quality?
- Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
- Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
Jing Jiang says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
The system copies the information it needs from master data when a transaction is created. Poor-quality master data can lead to unreliable information and flawed processes during the work.
To make sure master data in an ERP system works well, proper master data management focused on data quality is important. Possible methods include:
1) Having a periodic review of the values in critical fields.
2) Checking for duplicate master data, including similar names and street addresses.
3) Identifying missing bills of materials.
4) Reconciling payment terms, transfer prices, internal and external information records, etc.
5) Checking the accuracy of the reconciliation accounts and making sure they are maintained consistently.
6) Segregating the job of performing transactions from that of reviewing them.
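A check like point 2 can be partly automated. Here is a minimal sketch in Python, assuming the vendor master has been exported to simple name/street records; the records, IDs, and the 0.85 threshold are illustrative, not SAP-specific:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough similarity ratio in [0, 1] between two strings, case-insensitive."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_likely_duplicates(vendors, threshold=0.85):
    """Flag pairs of vendor records whose name AND street are very similar."""
    flagged = []
    for i in range(len(vendors)):
        for j in range(i + 1, len(vendors)):
            a, b = vendors[i], vendors[j]
            if (similarity(a["name"], b["name"]) >= threshold
                    and similarity(a["street"], b["street"]) >= threshold):
                flagged.append((a["id"], b["id"]))
    return flagged

# Illustrative records; the IDs and fields are invented for the example.
vendors = [
    {"id": "100001", "name": "Acme Supplies Inc.", "street": "12 Main Street"},
    {"id": "100002", "name": "ACME Supplies Inc",  "street": "12 Main Street."},
    {"id": "100003", "name": "Globex Corp",        "street": "9 Oak Avenue"},
]
print(find_likely_duplicates(vendors))  # → [('100001', '100002')]
```

Even a simple ratio like this will not catch every duplicate (e.g. "St." versus "Street"), which is why field normalization and human review of the flagged pairs remain part of the periodic check.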
Khawlah Abdulaziz Alswailem says
Well said, Jing.
Adding to your points, enhancing master data management includes source identification, data collection, data transformation, normalization, rule administration, error detection and correction, data consolidation, data storage, data distribution, data classification, and many other activities. Effective MDM can also help the enterprise combine all of its core data into a master file and provide a common point of reference.
Mengting Li says
Great post, Jing. I think master data is important because the benefits from the master data management paradigm increase as the number and diversity of organizational departments, worker roles, and computing applications expand. For master data management to function at its best, all personnel and departments must be taught how data is to be formatted, stored and accessed. Frequent, coordinated updates to the master data file are also essential.
Matthew J. Dampf says
Great post, Jing. This is definitely an ongoing process, not a simple one-time data entry job that is soon forgotten. The data needs to be reviewed, checked for duplicates, and its maintenance needs to be segregated from those performing transactions, though the person performing transactions would be one line of defense for finding duplicate or inaccurate data in the system.
Michelangelo C. Collura says
The periodic review seems silly, but it is very important that firms use that technique and take it seriously. A concern would be staff getting comfortable because no issues turn up in periodic reviews, so they make them less frequent or even stop them altogether. It is important to review data, especially when multiple users are making new entries or have the ability to modify existing data.
Jing Jiang says
2. Which department or person should play the key role in defining master data and assuring its quality?
Master data is a highly integrated resource that impacts various business processes and different departments' decisions. No single person can be familiar with all business processes and understand the true needs of every department, so defining master data and assuring its quality should be a coordinated effort across departments. Data coordinators, analysts, and senior management from every department that needs to use the data should sit together to define the master data and assure its quality.
Khawlah Abdulaziz Alswailem says
I totally agree with you, Jing
I would use the term "data steward" here: a role within an organization responsible for using the organization's data governance processes to ensure the fitness of data elements, both the content and the metadata. Data stewards have a specialist role that incorporates processes, policies, guidelines, and responsibilities for administering the organization's data in compliance with policy and/or regulatory obligations. So I think it is necessary to designate a master data coordinator or data steward to be responsible for defining master data and assuring its quality.
Andres Galarza says
The organization I work for actually has an “Office of the Chief Data Officer” or OCDO. I believe in addition to managing data classification, management, and access, they’re key stakeholders for an application like SAP.
Parneet Toor says
You made a good point, Jing. I think it is always good to have a two-level review to manage data quality. One department may overlook a mistake that is then noticed by a different person or department.
Lezlie Jiles says
Hi Jing,
I definitely agree with your response to this question. Because the work is spread across various processes and affects many parts of the organization, I believe a master data coordinator would be the best person (or persons) to define the master data and assure its quality. As you stated, no single person can be familiar with all business processes and understand the true needs of every department.
Jing Jiang says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I would say both inaccurate data and excessive repetitive data are risky to a company.
Inaccurate data can be used by many departments and result in a large-scale negative impact. Any analysis based on inaccurate data could mislead decision makers into bad decisions on critical issues and, potentially, create compliance issues.
Repetitive data also creates risk, since it wastes time and money. The unnecessary data takes up more space in the system and can affect the operation of other applications; the company may need to buy new systems or facilities to support the workload. Employees will also need more effort to find the data they really need among the duplicates, so work efficiency will be reduced.
Both inaccurate data and excessive repetitive data create risk for a company, with negative impacts on the company's reputation and, possibly, financial losses. If I must choose one, I think inaccurate data is riskier, since it may be more difficult to resolve. In addition, repetitive data can itself produce inaccurate information, which creates similar negative results.
Khawlah Abdulaziz Alswailem says
Jing,
You are right, both inaccurate data and excessive repetitive data create risk for a company, but I think inaccurate data is a bigger risk than excessive, repetitive data, because data is used in almost all company activities and constitutes the basis for decisions at the operational and strategic levels. Inaccurate data can have significant negative impacts on the efficiency of an organization. In fact, inaccurate data can lead to financial loss, decreased customer satisfaction, and lower performance.
Parneet Toor says
Jing, you are right, but I think inaccurate data could affect the company more. That being said, it affects the compliance area. Data is more or less what comprises the financial statements of the company, and organizations have a responsibility to file properly with regulatory bodies. Inaccurate data that is material to the financial statements can become a compliance issue, which could result in a loss of stakeholder trust, loss of creditworthiness, etc.
Andres Galarza says
I agree, Parneet. In my own answer to this question, I also thought that inaccurate data is a more severe risk than repetitive data. I'd add "tarnished image" to your list of consequences for an organization that doesn't catch inaccurate data in its business processes.
Lezlie Jiles says
Hi Parneet,
Absolutely agree! I also agree with the group that both kinds of data can be a risk, but inaccurate data is indeed the riskier of the two. In my earlier post I pointed to the financial statements becoming inaccurate because of inaccurate information; however, compliance did not come to mind, so thanks for pointing that out!
Michelangelo C. Collura says
I would agree that inaccurate data seems like the greater risk, but perhaps we should think about the type of firm, because that affects things as well. Inaccurate data would be dangerous to many firms, but repetitive data could spell disaster too. If a firm relies on automatic shipments as part of SLAs with clients, it sends the same things every month, so repetition is expected; if it accidentally repeats one order, that could be a waste of money. But inaccurate data would perhaps be easier to identify in such a situation, since the firm has a long history of identical shipments/transactions.
M. Sarush Faruqi says
Jing,
Great points about both inaccurate and repetitive data. I agree that repetitive data will definitely slow the system down by taking up space and reducing efficiency. It will take longer for the system to search through the data when employees perform a task. This is why normalizing data is important. Normalization will reduce ambiguity and the data can be used as intended. In terms of risk, inaccurate data can cause more damage to a company considering that data is used in decision making processes and reporting. From an ERP standpoint, data is used from one business process module to the next so there are dependencies within the processes themselves. If inaccurate information is used, the process might be still complete but results of the process may not be correct.
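Sarush's point about normalization can be illustrated with a small sketch: canonicalizing free-text fields before records are stored or compared. The abbreviation map and rules below are invented for the example, not taken from any MDM standard:

```python
import re

# Illustrative abbreviation map; a real MDM rule set would be far larger.
ABBREVIATIONS = {"st": "street", "ave": "avenue", "rd": "road", "inc": "incorporated"}

def normalize_field(value: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace, expand abbreviations."""
    value = value.lower()
    value = re.sub(r"[^\w\s]", " ", value)            # drop punctuation
    tokens = value.split()                            # collapses runs of whitespace
    tokens = [ABBREVIATIONS.get(t, t) for t in tokens]
    return " ".join(tokens)

print(normalize_field("12  Main St."))         # → 12 main street
print(normalize_field("Acme Supplies, Inc."))  # → acme supplies incorporated
```

Once every record passes through the same rules, "12 Main St." and "12 Main Street" become identical strings, so duplicate checks and cross-process lookups behave consistently.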
Khawlah Abdulaziz Alswailem says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data is a valuable information asset for an organization. It usually includes core data like business partner master data, item master data, employee master data, and finance master data. To ensure the integration works well for all, effective master data management (MDM) is required. MDM is not a one-time data take-on for a short-term need; the quality of master data also needs to be sustained over the long term.
An organization can also attempt to assure the master data is integrated well by:
– Implementing data governance.
– Developing master data policy and models.
– Designing infrastructure.
– Generating and testing master data.
– Maintaining the program.
Parneet Toor says
I agree with you, Khawlah; controls are necessary to provide assurance of master data integration for all. Policies and procedures stipulate who creates master data and how it is created; controls are what ensure those policies and procedures are carried out correctly. Controls catch errors and ensure the master data is accurate and complete for all parties who use it in their processes.
Khawlah Abdulaziz Alswailem says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
In my opinion, inaccurate data is a more significant risk than excessive, repetitive data, for two reasons. First, the organization uses its data in making critical decisions: sales trends, pricing, advertising, inventory turnover, manufacturing rates, and much more all depend on data. If the data a decision maker uses is not accurate, the decision could be incorrect, leading to financial loss and depletion of resources. Second, many tools can be used to filter out repetitive data, while there are few tools to detect and/or correct inaccurate data.
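Khawlah's second reason can be made concrete: removing exact duplicates is mechanical, while catching inaccurate data requires business rules that someone must define. A hedged sketch in Python, with validation rules invented purely for illustration:

```python
def drop_exact_duplicates(records):
    """Mechanical: identical rows are removed via a hashable key of their fields."""
    seen, unique = set(), []
    for r in records:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

def validate(record):
    """Inaccuracy needs domain rules: each check encodes business knowledge."""
    errors = []
    if not record.get("payment_terms"):
        errors.append("missing payment terms")
    if record.get("price", 0) <= 0:
        errors.append("non-positive price")
    return errors

# Illustrative rows; fields are invented for the example.
rows = [
    {"vendor": "Acme", "price": 10.0, "payment_terms": "NET30"},
    {"vendor": "Acme", "price": 10.0, "payment_terms": "NET30"},  # exact duplicate
    {"vendor": "Globex", "price": -5.0, "payment_terms": ""},     # inaccurate
]
deduped = drop_exact_duplicates(rows)
print(len(deduped))          # → 2
print(validate(deduped[1]))  # → ['missing payment terms', 'non-positive price']
```

The duplicate is caught with no knowledge of the business at all; the bad Globex row is only caught because someone wrote rules saying what "accurate" means.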
Parneet Toor says
Khawlah, you have valid points, and I think inaccurate data is more of a risk to a company than excessive repetitive data. There are various tools that can be used to check for and get rid of repetitive data, whereas there are fewer tools to correct inaccurate data. Data is used in almost all company activities and constitutes the basis for decisions at the operational and strategic levels. Inaccurate data can have negative impacts on the performance of an organization; in fact, it can lead to financial loss, lower performance, and reduced customer satisfaction.
Michelangelo C. Collura says
That is a good point about detective/corrective controls for repetitive data. Since the data is understood to be duplicated by the software, it means less potential for wasted time and labor. Inaccurate data, as you say, can be detected, but it requires the software knowing what the right data is. For one-time orders for example, this may not be possible. If a firm relies on such orders in a big way, such as Amazon, that could be a disaster.
Parneet Toor says
Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I think which transactions are sensitive varies depending on the time of year and the characteristics of the business. However, transactions like FS00 (G/L account creation) for finance teams can be sensitive because, if they aren't controlled strictly, there is a risk of duplication or the creation of fake G/L accounts. That could potentially enable fraud and cause misallocation of funds.
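A first pass of an SAT review over transactions like FS00 can be sketched as a scan of user-to-transaction assignments against a watch list of sensitive codes. The assignments and the watch list below are illustrative; in a real SAP audit this data would come from the authorization tables or a GRC tool:

```python
# Hypothetical watch list of sensitive transaction codes
# (FS00 = G/L account maintenance, FK01 = create vendor, F110 = payment run).
SENSITIVE_TCODES = {"FS00", "FK01", "F110"}

def flag_sensitive_access(user_tcodes):
    """Return {user: sorted sensitive tcodes} for users holding any watched code."""
    findings = {}
    for user, tcodes in user_tcodes.items():
        hits = sorted(set(tcodes) & SENSITIVE_TCODES)
        if hits:
            findings[user] = hits
    return findings

# Invented user-to-tcode assignments for the example.
assignments = {
    "alice": ["FS00", "FB50"],
    "bob":   ["VA01"],
    "carol": ["F110", "FK01"],
}
print(flag_sensitive_access(assignments))  # → {'alice': ['FS00'], 'carol': ['F110', 'FK01']}
```

The auditor's judgment still decides which codes belong on the watch list for a given business, which is exactly Parneet's point about sensitivity varying by company.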
Jing Jiang says
Actually, many transactions are sensitive; the timing and characteristics of the business are important elements in determining a transaction's sensitivity. In my view, the nature and value of a transaction, especially where it will lead to significant financial impact, make it among the most sensitive. Transactions that influence financial results, which relate to the company's profits and losses, are the places where human error and fraud are most likely to exist, and thus they should have extra focus.
Andres Galarza says
This question made me think of an initiative we have going on at work that has similar concerns. Our business’s “digital crown jewels” is a project that aims to identify the most sensitive applications we run at the company. This approach would be equally important in identifying “SAP transaction crown jewels” i.e. the most sensitive transactions in SAP.
Yijiang Li says
Good thought, Parneet. Every company will have different types of sensitive transactions based on its business. However, the general ledger is always essential data that a company should consider sensitive, because it demonstrates the complete business process of an organization and the related records of money in and out.
Parneet Toor says
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data in the ERP system is highly integrated and used by several different processes, which affect different parts of a company. Due to its importance within the ERP system, controls must be created and incorporated so that the data works properly within the different processes. If any wrong data is present in the master data, it can cause errors in all the applications that use it. A company can assure master data integrity with controls like segregation of duties and continuous testing of master data.
M. Sarush Faruqi says
Parneet,
Great points. Controls are an essential part of making sure the integration aspect works well. Master data should only be created by authorized individuals, and there should definitely be checks at predetermined time intervals to make sure the master data is accurate. Master data is integrated into so many processes, and it is essential that the data passed through each process is accurate. If data issues arise, employees will spend more time fixing the errors than on their day-to-day work, which reduces efficiency. Employees should also monitor the data in their day-to-day work to ensure there is nothing unusual about the data they work with.
M. Sarush Faruqi says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
From a company standpoint, I think both inaccurate data and excessive repetitive data can be risky, but if I had to choose one, it would be inaccurate data. Data is used in every business function and business process. It is used in critical business decision making by management. It is used to provide customers better service and bring in more customers. It can help employees be more productive, as they can spend more time on the core mission of the company rather than fixing data-related errors. Accurate data also helps companies stay in compliance, especially in industries where trading is common and heavily governed. Inaccurate data can lead to reputational damage for a company if incorrect business decisions are made because of it. Financial statements could be misstated, resulting in a company not accurately displaying its financial position to investors, which could lead to lost revenue. In conclusion, there would be no visibility into anything if data were inaccurate. From business processes to business decisions to financial reporting, accurate data is what keeps everything together for a company.
Yijiang Li says
Nice explanation of inaccurate data, Sarush. For most companies, inaccurate data would cause more damage or loss to daily operations than excessive repetitive data. However, we still can't ignore the negative effects of excessive repetitive data. Sometimes excessive repetitive data can increase the computing and processing load on computers and servers, so that both face a greater risk of downtime.
Mengting Li says
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data is a specific data set holding structured information about spare parts, raw materials, and products within Enterprise Resource Planning (ERP) software. The data is held centrally and used across the organization; it represents the business objects that are agreed on and shared across the enterprise. To assure the integration works well for all, the organization should put certain controls in place. First, curating and managing master data is key to ensuring master data quality: analysis and reporting depend greatly on the quality of an organization's master data, so good master data management is necessary. Also, segregation of duties is a good way to reduce the chance of fraud.
Mengting Li says
Which department or person should play the key role in defining master data and assuring its quality?
Master data is the basic data required to record business transactions. It can cover relatively static reference data as well as transactional, unstructured, analytical, and hierarchical data and metadata. I think the IT department and the finance department should play the key roles in defining master data and assuring its quality. People in the finance department are responsible for entering data into the system, and entering wrong data affects the quality of master data. Also, since the data is managed and stored in the ERP system, addressing master data management problems should be the IT department's job.
M. Sarush Faruqi says
Mengting,
You make some great points in your post. If we look at this question from an ERP perspective, I think it is really difficult to pinpoint an exact department to define the master data and assure its quality. While I agree with you that Finance and IT departments should play a critical role in defining this data, other business functions who consume this data should also be involved. Business Functions such as Purchasing or Marketing should also be involved as they are a part of business processes such as Procure to Pay and Order to Cash. Master Data is something which should be tested repeatedly and these different functions should work together to do this in order to ensure the quality of the data is fulfilling their processes at an optimal level. IT should definitely be the line of support in the event this data is not available but the quality can only be improved if all of the business functions who use this data are involved.
Candace Nelson says
Hi Mengting, Sarush –
I agree with both of your responses. I took it a step further and suggested that master data integrity should be governed by the same levels of executive management and the Board of Directors who oversee all critical transactions processed by a company. After all, if the information utilized in the preparation of significant transactions is flawed, upper management is likely to be held responsible. Additionally, since master data is used so widely in an organization, accountability needs to lie with those who are able to influence the nature and extent of the controls devoted to its overall integrity in order for them to be adhered to.
Mengting Li says
Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I consider inaccurate data a bigger risk than repetitive data. First, inaccurate data can cost you your customers' trust; it absolutely damages a company's reputation, which leads to losing customers. It can also result in decreased revenue, because poor data quality has a heavy impact on an organization's revenue; one study found that 75 percent of businesses are wasting 14 percent of revenue due to poor data quality.
Binju Gaire says
Nice explanation, Mengting. I also consider inaccurate data riskier than repetitive data. If a company provides inaccurate data, it will certainly lose its customers' trust, which can be damaging to the company's reputation. Inaccurate data will also mislead and create errors in the balance sheet and income statement.
Xiaomin Dong says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
In an organization, master data is very important for operational and analytical business decision-making. Given its importance, I think an organization should establish a master data governance team and an operating model for it. This governance body should be composed of representatives from each area of the business and be vested with the authority to define and approve policies governing the master data lifecycle to ensure data quality and usability, oversee the process workflows that touch master data, define and manage business rules for master data, inspect and monitor compliance with defined master data policies, and notify individuals when data errors or process faults put the quality or usability of the data at risk.
M. Sarush Faruqi says
Xiaomin,
Great points. A data governance team is a great way to ensure that the definition and quality of the master data are effective for the various processes it is used in. If the team consists of members from the different functions, they can provide input on how the data should be defined and how quality should be maintained for the processes they are involved in. In addition, this team can perform testing to make sure the data is still usable, and it can be involved in setting up validation techniques to make sure the data is accurate and complete.
Xiaomin Dong says
2. Which department or person should play the key role in defining master data and assuring its quality?
As far as I am concerned, the finance department should play the key role in defining master data and assuring its quality. Finance architecture is often complex due to multiple systems, each with a dedicated repository, so it is difficult to share data across a wide range of financial applications. Financial master data needs to be quickly and easily adapted to business changes at the fast pace of business. Financial MDM is essential not only for minor modifications in day-to-day business, such as creating a new cost center or modifying an existing attribute of a legal entity, but also when making major changes such as mergers, acquisitions, reorganizations, or regulatory changes.
Andres Galarza says
Xiaomin,
As others have said, it’s important to answer this question with the perspective of what the business does. For the example we’re building in class (Global Bike) it’s important to have bike subject matter experts involved in populating the data in the master data. A finance or technology person might be very familiar with “the books” or the technical aspects of SAP, but might not know anything about Global Bike’s key business processes. It needs to be a team effort.
Xiaomin Dong says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I would say that inaccurate data is more of a risk to a company. First of all, both inaccurate and excessive repetitive data can negatively impact a company, but the excessive repetitive data problem is easier to resolve. Inaccurate data has many negative impacts associated with it: it is never a good sign to have deficient inventory, because if customers don't get what they wanted, they will most likely turn somewhere else. Inaccurate data may lead top managers to make bad, wrong, or delayed decisions. It also wastes money and time to check the data again and again, and companies will lose many opportunities to make better decisions.
Xiaomin Dong says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I believe a payment transaction should be considered the most 'sensitive.' Since that transaction directly deals with payments and money, it needs to be treated very carefully. Money-related things are always sensitive to all kinds of entities, from companies and organizations of any size down to a family, so people should handle money-related fields very carefully. Segregation of duties must be in place to prevent fraud and errors, and additional authorization should oversee these transactions thoroughly.
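The segregation-of-duties point can be sketched as a scan for users holding conflicting duties, e.g. both creating a vendor and releasing a payment. The conflict matrix and duty names below are invented for illustration:

```python
# Hypothetical conflict matrix: duty pairs no single user should hold together.
CONFLICTS = [
    ("create_vendor", "release_payment"),
    ("enter_invoice", "approve_invoice"),
]

def sod_violations(user_duties):
    """List (user, duty_a, duty_b) for every conflicting pair a user holds."""
    violations = []
    for user, duties_held in user_duties.items():
        held = set(duties_held)
        for a, b in CONFLICTS:
            if a in held and b in held:
                violations.append((user, a, b))
    return violations

# Invented assignments for the example.
duties = {
    "dave": ["create_vendor", "release_payment"],  # conflict
    "erin": ["enter_invoice"],                     # fine on its own
}
print(sod_violations(duties))  # → [('dave', 'create_vendor', 'release_payment')]
```

Defining which duty pairs actually conflict is the hard part; the scan itself is trivial once the business has agreed on the matrix.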
Binju Gaire says
I agree with you, Xiaomin. In addition to your points, payment transactions include information like customers' credit card numbers, addresses, and other PII. This further makes the payment transaction the most sensitive, deserving extra focus in an SAT (Sensitive Access to Transaction) audit.
Qiyu Chen says
Xiaomin, you are correct. Money-related things are always sensitive to all kinds of entities, from companies and organizations of any size down to a family, so people should handle money-related fields very carefully, and segregation of duties must be applied to prevent fraud and errors.
Binju Gaire says
Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I believe inaccurate data is more of a risk to a company. Inaccurate data can result from various sources, such as mistakes in data entry, system problems, flawed processes, and so on. Inaccurate data cannot be relied upon to operate the business. Further, inaccurate data represents misinformation and misleads customers. The consequences of inaccurate data include discrepancies in financial reporting and reconciliations. Finally, inaccurate data will lead to audit findings as well.
Andres Galarza says
Q2: Which department or person should play the key role in defining master data and assuring its quality?
Not a technology person! This critical data should be jointly agreed upon by experts in the respective lines of business. The technologists should absolutely play a role in assuring quality of the data, but it’s the business process owners that need to define what data needs to be present.
Binju Gaire says
I agree with you, Andres. Defining master data and assuring its quality is teamwork, so managers and other employees from various departments should be involved in the process.
Andres Galarza says
Q3: Which is more of a risk to a company: inaccurate data or excessive repetitive data?
Inaccurate data, in my opinion. Excessive/repetitive data can create cost issues, but inaccurate data can snowball into a calamitous problem.
Matthew J. Dampf says
While I agree with your conclusion, I'd say that excessive repetitive data has the potential to be calamitous as well. I'm thinking about paying a vendor multiple times in the procure-to-pay process, or shipping a single order multiple times in order-to-cash.
Lezlie Jiles says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. An organization should have a well-defined process, as well as controls and training in place, to ensure that the data captured conforms to organizational standards. Therefore, an organization can assure that these integrated processes work well by utilizing master data management: an established set of rules to guarantee the consistency, correctness, and timeliness of several areas within enterprise data.
Yijiang Li says
I agree with you, Lezlie. Every department within an organization is responsible for master data; therefore, finding a method to let them work together is quite important. As you said, management has the power and authority to drive different departments to work on master data by establishing policies and procedures.
Lezlie Jiles says
2. Which department or person should play the key role in defining master data and assuring its quality?
My initial response to this question was the accounting department. However, master data is utilized by a multitude of departments, so my second thought was that the accounting department should be involved but not be the key department. Accounting is indeed the most critical function, but master data is highly integrated and involves various processes that affect many parts of the organization. Therefore, I believe the IT department, along with the input of management-level employees from those various departments, or a master data coordinator, would be best positioned to define the master data and assure its quality.
Matthew J. Dampf says
“the input of management-level employees from those various departments”
I’m with you on this part of it, Lezlie. Each department has more at stake than any other when it comes to the accuracy of the master data that their department needs. Therefore, I’d think that management in each department would be represented on a committee that governs master data.
Yijiang Li says
I agree with you, Lezlie. The finance and accounting departments should play the key role in defining the master data and assuring its quality, because their staff have the necessary knowledge and relevant data. However, the IT department should provide technology support to the departments using the master data, and perform maintenance on the master data on a monthly basis.
Lezlie Jiles says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
In my opinion, both inaccurate and repetitive data put a company at risk. However, I believe inaccurate data is the worse of the two. If you think about it, organizations utilize data to make important decisions on marketing, pricing, discounts, etc. If any of that information is incorrect, it could put the company at risk of, among other things, financial loss. Repetitive data is also bad; as an example, it could affect the billing or receivables systems, producing inaccurate invoicing.
Binju Gaire says
Well said, Lezlie. Even though both inaccurate data and excessive repetitive data are risky, inaccurate data seems to be more of a risk because of the discrepancies it can cause in the financial reports.
Mengting Li says
I agree with you, Lezlie. I also believe that inaccurate data is the worse of the two. Based on inaccurate data, it would be hard for the organization to make the right decision. A wrong decision might lead to decreased revenue, lost customers, and a damaged reputation.
Lezlie Jiles says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
This is a tricky question because the nature of the transaction will determine its sensitivity. Therefore, I would have to say that if the transaction affects the master data in any way, then it would be the most sensitive. Nevertheless, any and all transactions can be sensitive and should have extra focus in an SAT audit.
Matthew J. Dampf says
“3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain”
– Having a customer’s data loaded twice is a problem, but having the wrong address on file will cause orders to go to the wrong location.
– A customer receiving an order multiple times can be worse than them not receiving one at all.
– While paying a supplier multiple times is arguably worse than not paying them at all, it is possible that the supplier will alert us that payment has been received multiple times for the same order.
I came into this question thinking the answer was definitely that inaccurate data is worse, but repetitive data is a big risk as well when considering the possibilities. I do think it's easier to correct excessive repetitive data with a skilled DBA, so I would say inaccurate data is worse.
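One reason a skilled DBA can correct repetitive data is that duplicates are often mechanically detectable. Below is a minimal sketch of that idea using only Python's standard library: it flags customer master records whose names and streets are suspiciously similar. The records, field names, and similarity threshold are all illustrative assumptions, not any particular ERP's data model.

```python
from difflib import SequenceMatcher

# Hypothetical customer master records; ids and fields are made up.
customers = [
    {"id": 1001, "name": "Acme Industries", "street": "12 Main St"},
    {"id": 1002, "name": "ACME Industries Inc.", "street": "12 Main Street"},
    {"id": 1003, "name": "Global Bike Inc", "street": "400 Market Ave"},
]

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_duplicates(records, threshold=0.8):
    """Return pairs of record ids whose name AND street both look alike."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            a, b = records[i], records[j]
            if (similarity(a["name"], b["name"]) >= threshold
                    and similarity(a["street"], b["street"]) >= threshold):
                pairs.append((a["id"], b["id"]))
    return pairs

print(likely_duplicates(customers))  # [(1001, 1002)]
```

The same pairwise-comparison approach scales poorly, so in practice a review like this would run on blocked subsets (e.g. same postal code), but the flagged pairs still need a human to decide which record to keep.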
Qiyu Chen says
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Definition of Master Data Management is “a set of disciplines and processes for ensuring the accuracy, completeness, timeliness and consistency of the most important types (or domains) of reference data in the enterprise – across different applications, systems and databases, and across multiple business processes, functional areas, organizations, geographies and channels.”
Bottom line — some companies can use an ERP system as their cross-functional, enterprise-wide "single system", while others have such a heterogeneous IT environment that they require a dedicated MDM platform.
When an organization views data as an enterprise asset, it establishes an executive-level data governance committee that oversees data stewardship across the organization. Data governance exerts control over multiple business initiatives and technology implementations, to unify these through consistent data definitions and gain greater reuse for IT projects and business efforts.
Qiyu Chen says
Which department or person should play the key role in defining master data and assuring its quality?
Firstly we should know what is the master data. Master data is the core data that is essential to operations in a specific business or business unit. Master data may be about: customers, products, employees, materials, suppliers, and vendors, and it may also cover: sales, documents and aggregated sales. It is the primary focus of the Information Technology discipline of Master Data Management (MDM).
In my opinion, the IT department should play the key role in ensuring the data's integrity and availability, while the finance and marketing departments should make sure the data is correct.
Michelangelo C. Collura says
I agree that the originators of the data – the data owners – should be responsible for identifying and defining master data. Since they have the best sense of the data and its importance, they can also best understand how to assure its quality. If this duty were handled by the IT staff, who also address security and access, there is a risk that they would not approach the material properly, which would be a potential mess for the firm.
Qiyu Chen says
Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I think inaccurate data is more of a risk to a company because it would not only mislead decision makers but also create compliance problems.
Accuracy is important for making decisions about the future; inaccurate data would mislead the decision makers. A wrong decision could produce flawed budgets, which may even lead the company to bankruptcy. The law also requires organizations to provide accurate information or face severe penalties; for example, SOX requires top management to individually certify the accuracy of financial information. On the other hand, repetitive data is much easier to control, as many tools can filter it out.
Qiyu Chen says
Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
By doing some research online, I believe that transactions like FS00 (G/L Account Creation) that enable users to create, modify, or delete G/L accounts should have extra focus in an SAT audit. These transactions should generally be restricted, with authorization granted only for specific business needs. In addition, we should secure transaction FB01 (Post Document) and any manual journal posting transaction, such as FB50 and F-02. All of these transaction codes can be abused and have a negative impact on the business. Therefore, an SAT audit should have extra focus on the transactions that can create and update such data, and we need to make sure access is granted appropriately to employees.
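The core of that SAT check can be sketched mechanically: compare each user's assigned transaction codes against a list of sensitive ones. The user-to-tcode mapping below is hypothetical; in a real audit it would come from an export of SAP authorization data (for example, reports run through SUIM), and the sensitive list would be agreed with the auditors.

```python
# Sensitive SAP transaction codes named in the post above.
SENSITIVE_TCODES = {"FS00", "FB01", "FB50", "F-02"}

# Hypothetical assignments, standing in for an authorization export.
user_tcodes = {
    "jsmith": {"VA01", "VA02"},          # sales order entry only
    "akhan":  {"FS00", "FB50", "FB01"},  # G/L maintenance plus posting
    "mlopez": {"FB60", "F-02"},
}

def flag_sensitive_access(assignments, sensitive):
    """Return {user: sorted list of sensitive tcodes} for users to review."""
    return {
        user: sorted(tcodes & sensitive)
        for user, tcodes in assignments.items()
        if tcodes & sensitive
    }

for user, hits in flag_sensitive_access(user_tcodes, SENSITIVE_TCODES).items():
    print(f"{user}: review access to {', '.join(hits)}")
```

A flagged user is not automatically a violation; the audit step that follows is verifying each hit against a documented business need.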
Edward Gudusky says
1. Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
Since information is copied from master data when creating business transactions, an organization should have solid internal controls in place to assure smooth integration of master data. One example of such a control is properly training the personnel who create master data. Another is segregating those who enter master data from those who perform transactions.
Edward Gudusky says
2. Which department or person should play the key role in defining master data and assuring its quality?
There should be a business process group that is tasked with maintaining master data. Within this group, there would ideally be a QA role. This business process group should fall under the Chief Information Officer.
Edward Gudusky says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
I would think inaccurate data is more of a risk, especially if the inaccurate data is classified as master data. Since master data is used in many transactions, an error could be very impactful in a negative, snowball-type fashion.
Edward Gudusky says
4. Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I would say any transaction which involves the creation of master data fields. Examples would be creating a new customer account or a new vendor account. With regard to these types of transactions, I would define ‘sensitive’ as meaning it is very important that the data be created correctly and accurately, as subsequent transactions will depend on it.
Candace Nelson says
Interesting Edward –
I was thinking that a transaction whereby master data could be changed would be a particularly sensitive one, especially if audit trails were lacking. Having been trained to think like a bad guy (or woman), it seems like a rogue employee could change the bank account and routing information on outgoing wire transfers and ACH payments to direct them to their own bank account, then change the information back without being detected. Of course, there are many other ways this fraud could be detected. However, persons who commit fraud usually start out small to see if they can get away with it. It is when they get greedy and increase the significance of the fraudulent activity that they often get caught!
Candace Nelson says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Based on my knowledge and experience, I believe inaccurate data is more of a risk to a company than excessive, repetitive data. However, my response assumes that management is unaware of the data inaccuracies. That being the case, financial and operational business results would be misleading to management and the public. Decisions made based on inaccurate data would likely lead to erroneous choices. Budgets based on prior-year actual results would be flawed, per the principle “garbage in, garbage out.” Employees would be negatively impacted, but not nearly as much as shareholders and other third parties, such as strategic alliances and business partners. Hence, the impacts of inaccurate data are far reaching. On the other hand, excessive, repetitive data would have more of an internal impact, e.g. increases in processing time and slower system response times. It is feasible that – if the excessive data is not normalized – it could lead to inaccuracies. I am interested in learning how others – and Professor Beaver – respond to this question.
Michelangelo C. Collura says
I feel that inaccurate data is a bigger risk overall, but thinking outside the box is useful, so I tried to imagine how repetitive data might be worse. Some companies may rely on repeat orders from clients, particularly service companies such as insurers. The transaction is the same every month, every year, with prices changing little or not at all. Duplicate orders in that case would be a huge waste of money if the software or staff don’t identify them, because the firm could be billing multiple times on a single policy. By contrast, if their data were inaccurate – say someone transferred customer data to a new ERP system and incorrectly entered a credit card number – it would be easy to identify the flaw because of the long history of previous orders. This may not hold in every case because of regulation, however, and I imagine it depends on the industry.
Candace Nelson says
2. Which department or person should play the key role in defining master data and assuring its quality?
Much like strategic planning, responsibility for defining and ensuring the quality of master data is shared between the business and IT, and these activities need to be governed by upper management and the Board of Directors. This is my opinion, and it is based on the fact that master data maintained in relational databases is utilized by cross-functional processes throughout the enterprise, so there can be no single owner.
Having said that, I am aware that data integrity positions have existed for many years. Hence, I googled this role and have included below the roles and responsibilities from a job description for a pharma Director, Data Integrity:
• Identify, develop and implement best demonstrated Data Governance and Data Integrity practices and systems across multiple sites:
• Develop and implement and maintain data governance and data integrity policies and standards,
• Develop and implement global processes and systems and associated global procedures;
• Work cross functionally to deploy throughout the organization;
• Ensure continuous improvement of Data Integrity program.
• Responsible for leading and executing a consistent, harmonized, sustainable and effective Data Governance and Data Integrity Program that complies with regulatory requirements and company established requirements.
• Responsible for ensuring systems across all sites comply with data life cycle requirements from initial data creation/recording to archival and decommissioning. This includes, but is not limited to, data management (e.g., data creation, data processing, review, reporting), data security, data traceability process mapping, data backup/restore, electronic signature/electronic record linking and data audit trails.
• Responsible for leading, reviewing and approving data integrity assessments across all sites of new and existing systems including, but not limited to, manufacturing and laboratory systems to ensure compliance with regulatory requirements and company established requirements for data integrity.
• Responsible for leading, reviewing and approving data integrity periodic reviews and performing risk monitoring of implemented systems across all sites to ensure continued compliance.
• Responsible for defining, assembling and communicating data integrity metrics that yield on-going process improvements and optimization across all sites.
• Responsible for leading, reviewing and approving mitigation and remediation strategies across all sites when data integrity gaps are identified.
• Responsible for leading, reviewing and approving investigations across all sites and implementing corrective/preventative actions associated with data integrity events.
• Responsible for ensuring data integrity processes for automated systems across all sites are designed to align with computerized system development life cycle (SDLC) methodologies to ensure computerized systems meet regulatory requirements, company requirements and align with industry standards. In this respect, this role is also responsible for assisting with strengthening and modernizing existing computerized SDLC methodologies to align with data integrity processes.
• Responsible for performing continuous improvement of the Data Integrity Program ensuring the program is maintained current with industry standards and support key company initiatives.
• Maintains awareness of data integrity regulatory actions, current regulatory trends and their impact on existing systems and recommends internal process improvements.
• Provides leadership in interpreting regulations and guidelines associated with data integrity to ensure continued compliance of the Data Integrity Program.
• Seeks out and recommends to management opportunities for increased data integrity program efficiencies and operational improvement through modifications to current systems, implementation of new systems and more efficient use of established systems.
As is evident, this employee will not be solely responsible for data integrity; rather, they will play a key role in how data is managed cross-functionally throughout this particular organization.
https://www.glassdoor.com/job-listing/director-data-integrity-celgene-JV_IC1127008_KO0,23_KE24,31.htm?jl=2495396507
Yijiang Li says
3. Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Usually, inaccurate data greatly affects the daily operations of an organization. For instance, from the company’s perspective, inaccurate data on the size and weight of a bicycle’s tires could affect the assembly line of Global Bike Inc. and cause defective products to enter the market. From the customer’s perspective, inaccurate credit information or an inaccurate shipping address could cause payment or delivery failures. However, the negative effect of excessive repetitive data cannot be ignored either. For example, excessive repetitive data could increase the burden on computers and servers and cause a higher probability of downtime.
Michelangelo C. Collura says
Master data in an ERP system is highly integrated with various processes and affects many parts of the organization. How does an organization assure this integration works well for all?
There is a process described in the week’s readings whereby a firm will validate and reconcile transferred/integrated data with data from the master file. Before all of this, the data is analyzed to determine how it fits into an old system, and how it’ll fit into the new. Definitions are also reviewed to note any differences. Data is then transformed or modified in order to fit into the new framework. Finally, the data is actually loaded and reconciled with the original to ensure accuracy. This is a very precise method, and mistakes can occur, but with proper application, risk is minimized. It is also a standardized method, so firms can have peace of mind in integrations regardless of industry or complexity.
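The final load-and-reconcile step described above can be illustrated with a small sketch: after loading, compare record counts and a fingerprint of the critical fields between the legacy extract and the migrated data. The records, field names, and hashing choice below are illustrative assumptions, not the method from the readings.

```python
import hashlib

# Hypothetical legacy extract and migrated target data.
legacy = [
    {"id": "C100", "name": "Acme Industries", "terms": "NET30"},
    {"id": "C200", "name": "Global Bike Inc", "terms": "NET45"},
]
migrated = [
    {"id": "C100", "name": "Acme Industries", "terms": "NET30"},
    {"id": "C200", "name": "Global Bike Inc", "terms": "NET60"},  # drifted field
]

def fingerprint(rec, fields=("name", "terms")):
    """Stable hash of one record's critical fields."""
    raw = "|".join(rec[f] for f in fields)
    return hashlib.sha256(raw.encode()).hexdigest()

def reconcile(src, dst):
    """Return ids whose critical fields differ between source and target."""
    assert len(src) == len(dst), "record counts differ"
    dst_by_id = {r["id"]: r for r in dst}
    return [r["id"] for r in src
            if fingerprint(r) != fingerprint(dst_by_id[r["id"]])]

print(reconcile(legacy, migrated))  # ['C200']
```

Counting records catches dropped or duplicated rows, while the per-record fingerprint catches fields that were silently transformed or truncated during the load; both checks are needed for the reconciliation to mean anything.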
Michelangelo C. Collura says
Which department or person should play the key role in defining master data and assuring it’s quality?
I think the owner of the data would be best positioned to define and assure quality. In payroll, this might mean a payroll manager. In accounts payable, this might be a clerk in the department. They understand the data because it is what their jobs focus on, so they know how to identify important pieces and reconcile when transferring to new systems.
Michelangelo C. Collura says
Which is more of a risk to a company: inaccurate data or excessive repetitive data? Explain
Inaccurate data seems like a bigger risk because it compromises any processes relying on that data. Repetitive data can also compromise processes, particularly by causing delays or increased human error, but inaccurate data seems worse because there is no way for users to identify correct information. Unless the user is a data owner and realizes the problem, the firm will be compromised until the inaccuracy is fixed. This could mean orders not shipping because addresses are wrong, or staff being paid incorrect amounts. Productivity would take a hit in either case. If data is repeated, it would potentially slow down order processing or lead to staff accidentally filling orders multiple times, but there would be less chance of total chaos in my mind.
Michelangelo C. Collura says
Which transaction do you believe is the most ‘Sensitive’ and therefore should have extra focus in an SAT (Sensitive Access to Transaction) audit? Explain
I would think posting of sales transactions and the original sales order itself would be particularly worth watching and segregating because the possibility of fraud is so high. If a firm wishes to inflate their books to make it seem like the company is doing much better than it is, access should be restricted as much as possible to that part of the ERP system, in order to avoid giving the opportunity for such entries. This would seem especially true if sales staff are pressured to make quotas – they will be strongly incentivized to make up false orders to meet such quotas.