Mitra and Ransbotham (2015) Information Disclosure and the Diffusion of Information Security Attacks - Siddharth Bhattacharya
The paper addresses an ongoing debate in the research community about limited versus full disclosure of vulnerabilities that are often exploited by third parties. Proponents of limited disclosure argue that it gives vendors and targets reasonable time to develop and deploy patches and countermeasures before systems are attacked, whereas full disclosure creates a window of opportunity for attackers before patches and countermeasures are deployed. On the other hand, full disclosure gives vendors incentives to create better quality software and notifies security professionals so that they can install countermeasures immediately. Thus the paper seeks to answer the following questions: Does full disclosure speed the diffusion of attacks corresponding to the vulnerability through the population of target systems? Does full disclosure increase the risk that a firm is attacked for the first time on any specific day after the vulnerability is reported, given that it has not been attacked prior to that day? Does full disclosure increase the number of target firms affected by attacks based on the vulnerability? Does full disclosure increase the volume of attacks based on the vulnerability?
The authors build on measures developed in earlier literature to form analytical estimates of quantities such as Na(t), the cumulative number of attacked systems at time t, and Np(t), the cumulative number of protected systems at time t, and then use these to develop their hypotheses. Next, the authors augment this analytical work with two main data sources: a proprietary database of alerts generated by intrusion detection systems (IDSs) installed in client firms of a managed security services provider (MSSP) during 2006 and 2007, which they combine into a panel data set with dates from the National Vulnerability Database (NVD) to obtain detailed characteristics of the vulnerabilities.
The authors use a series of models (a non-linear regression model, a Cox model, and finally a Poisson model) to test their hypotheses, all of which are supported. Results indicate that full disclosure accelerates the diffusion of attacks corresponding to a vulnerability. Full disclosure also increases the risk of a first attack on any specific day after the vulnerability is reported, and it increases the penetration of attacks within the population of target systems. Additionally, although the aggregate volume of attacks remains unaffected by full disclosure, attack activity shifts earlier in the life cycle of a vulnerability, thereby reducing its effective life span but intensifying activity while it is active. The paper makes several contributions. Practically, quantifying the net effect of information disclosure on the diffusion of attacks informs the continuing debate about the optimal disclosure of information security vulnerabilities. It also adds depth to the debate about limited versus full disclosure and uncovers a potential negative effect of full disclosure. Finally, it adds to the diffusion-of-innovation literature by focusing on the diffusion of a societally undesirable innovation, in contrast to the positive innovations studied before.
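The "same aggregate volume, earlier activity" finding can be illustrated with a toy simulation (my own hypothetical decay rates, not the paper's estimated model): under full disclosure the daily attack rate starts higher but decays faster, so cumulative volume is similar while half the attacks arrive much sooner.

```python
import math

# Hypothetical illustration: daily attack rate peak*exp(-decay*t).
def attack_volume(peak, decay, days=365):
    """Daily attack counts over the vulnerability's life cycle."""
    return [peak * math.exp(-decay * t) for t in range(days)]

limited = attack_volume(peak=10.0, decay=0.02)   # slower start, longer tail
full = attack_volume(peak=20.0, decay=0.04)      # earlier, more intense burst

def days_to_half(daily):
    """Day by which half of all attacks have occurred."""
    total, running = sum(daily), 0.0
    for day, count in enumerate(daily):
        running += count
        if running >= total / 2:
            return day
    return len(daily)

# Similar aggregate volume, but attack activity shifts earlier under full disclosure.
print(round(sum(limited)), round(sum(full)))
print(days_to_half(limited), days_to_half(full))
```

With these made-up parameters the two regimes produce nearly identical totals, but full disclosure reaches its halfway point in about half the time, mirroring the compressed life span the paper reports.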
Wang, J., Gupta, M., & Rao, H. R. (2015). Insider threats in a financial institution: Analysis of attack-proneness of information systems applications. MIS Quarterly, 39(1).
Insider threats to information security are considered to be a critical issue for organizations. However, research on quantification of the risk of insider threats on information assets is sparse. Extending from the original context of explaining predatory crimes in the physical environment, this paper synthesizes routine activity theory (RAT) with survival modeling, and makes the following main contributions: In order to understand what causes applications to be exposed to insider threats, the study conceptualizes and operationalizes the main constructs of RAT (value, inertia, visibility, accessibility (VIVA), and absence of guardians) in the domain of information systems security.
To quantify the risk that an application will experience unauthorized access attempts, the authors applied survival modeling. Using log data from an enterprise single sign-on (ESSO) system and information on integrated IS applications from a regional financial institution, the authors then investigate the main constructs of RAT. The survival analysis results show that (1) the business value of an application (BVM) increased the risk of the application experiencing unauthorized attempts; (2) control strength (CSTR) decreased that risk; (3) access prevalence level (log(APL)) increased that risk; (4) OLIM increased that risk; and (5) data protection level (DPL) decreased that risk. All five hypotheses are thus supported by the empirical results.
Theoretically, this paper introduces measurements for managing risks against insider threats within an organization. The authors conceptualized and operationalized the risk of insider threats associated with information assets, providing a foundation for future research in risk management of digital assets. Practically, the study suggests that the practice of risk management for IS applications should be adapted to the organizational context and account for users' behavioral patterns.
Pang and Tanriverdi (2017) Security Breaches in the U.S. Federal Government
There has been limited research at the organization level on mechanisms to mitigate security vulnerabilities using actual breach incident data. This paper studies the effectiveness of three organizational IT risk mitigation mechanisms: modernization of legacy IT systems, institution of effective IT governance, risk, and compliance (GRC) processes, and migration of legacy IT systems to the cloud. The hypotheses are developed from criminology theories such as rational choice theory and crime opportunity theory: (1) federal agencies that spend higher percentages of their IT budgets on the maintenance of legacy IT systems are likely to experience more security incidents than ones that spend higher percentages of their IT budgets on IT modernization and new IT development; (2a) IT GRC effectiveness reduces security incidents in federal agencies; (2b) IT GRC effectiveness substitutes for IT modernization in reducing security incidents; (3) federal agencies that migrate their IT systems more to the cloud are likely to experience fewer security incidents.
The unit of analysis is a U.S. federal agency. Data on security incidents in the federal agencies are obtained from FISMA reports. Security breaches at federal agencies in 2005-2016 are obtained from an independent source, PRC. Data on the IT investment profiles of federal agencies are collected from the Federal IT Dashboard.
The study finds that (1) a 1-percentage-point increase in the share of IT modernization in the IT budget is associated with a 5% decrease in security incidents; (2) the institution of effective IT GRC mechanisms significantly reduces security incidents; and (3) there is a negative interaction effect between IT modernization and IT GRC, consistent with the substitution hypothesis. The findings complement the extant IS security literature on technical mitigating mechanisms by assessing the effectiveness of more managerially actionable mechanisms.
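The reported "1-point increase, 5% decrease" relationship can be checked with back-of-the-envelope arithmetic, assuming a standard log-linear (Poisson-type) count model, E[incidents] = exp(b0 + b1 * modern_share), in which a one-point increase in the modernization share multiplies expected incidents by exp(b1):

```python
import math

# Coefficient implied by a 5% decrease per percentage point (assumed log-linear form).
b1 = math.log(0.95)
multiplier = math.exp(b1)    # effect of a one-point increase in the budget share

baseline_incidents = 100.0   # hypothetical agency
after_one_point = baseline_incidents * multiplier
print(round(b1, 4), round(after_one_point, 1))  # → -0.0513 95.0
```

So a regression coefficient of roughly -0.051 on the modernization share is what the paper's headline 5% figure corresponds to under this functional form.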
Angst, Corey, et al. “When Do IT Security Investments Matter? Accounting for the Influence of Institutional Factors in the Context of Healthcare Data Breaches.” MIS Quarterly, vol. 41, no. 3, 2017, pp. 893–916.
The authors explore an interesting question about how hospital factors determine the extent to which hospitals are symbolic or substantive adopters of IT security practices. Institutional theory distinguishes between symbolic and substantive adoption to account for the degree to which the activities of a firm are accurately reflected in the signals it communicates to relevant stakeholders. Substantive adoption represents one extreme, where signals are accurate representations of adopted practices and are tightly integrated with the organization's core operations, whereas symbolic adoption is intended to enhance a firm's external validation or legitimacy rather than achieve a specific technical benefit. Using data from three different sources, the authors create a panel of more than 5,000 U.S. hospitals and 938 breaches over 8 years. They use a growth mixture model approach to model heterogeneity in the likelihood of breach and apply a two-class solution in which hospitals that (1) belong to smaller health systems, (2) are older, (3) are smaller in size, (4) are for-profit, (5) are nonacademic, (6) are faith-based, and (7) are less entrepreneurial with IT are classified as symbolic adopters. Their findings indicate that symbolic adoption diminishes the effectiveness of IT security investments, resulting in an increased likelihood of breach. Contrary to their theorizing, the use of more IT security is not directly responsible for reducing breaches; instead, institutional factors create the conditions under which IT security investments can be more effective.
Angst, C.M., Block, E.S., D’arcy, J. and Kelley, K., 2017. When do IT security investments matter? Accounting for the influence of institutional factors in the context of healthcare data breaches. MIS Quarterly, 41(3), pp.893-916.
Although organizations take numerous approaches to secure their IT assets, data breach incidents still happen frequently across industries. To investigate the underlying reasons, this study examines the effectiveness of IT security investment in the healthcare industry through the lens of neo-institutional theory: specifically, how symbolic and substantive adoption influence the success of security investment, what kinds of institutional factors are associated with the two types of adoption, and why IT security investment has a delayed effect.
According to institutional theory, the motivation to adopt a practice comes not only from its actual benefit but also from seeking legitimacy in the social structure. In most cases, this pressure can result in symbolic adoption, meaning the practice is not fully implemented and its benefits are not maximized. However, symbolic and substantive adoption cannot be observed directly, so the paper suggests several organizational characteristics that can predict each type of adoption. Combining theory with the available data, the study proposes that the following characteristics contribute to symbolic adoption: (1) membership in smaller health systems; (2) longer establishment; (3) smaller size; (4) for-profit status; (5) teaching status; (6) faith-based affiliation; and (7) less entrepreneurial use of IT. It further hypothesizes that (1) more IT security investment will reduce breaches; (2) substantive adoption will enhance the effectiveness of IT security investment over time; and (3) symbolic adoption will decrease the effectiveness of IT security investment over time.
The study collects hospital-level data from HIMSS and data breach incidents from different sources. It then uses a growth mixture model (GMM) for dichotomous outcomes to test the hypotheses. Some of the hypotheses are supported by the results.
Kwon, J. & Johnson, M. E. (2014). Proactive versus reactive security investments in the healthcare sector. MIS Quarterly, 38(2), 451-471.
The legislative mandates to disclose security breaches, coupled with the detailed data available on security investments, make the healthcare sector a viable industry in which to consider the impact of proactive versus reactive security investments. Proactive investments are those made by the organization prior to an issue or breach. Conversely, reactive investments are those that occur after an incident, in response to it. Data from the Healthcare Information Management Systems Society from 2005-2009 were collected to allow a Cox proportional hazards model to be estimated on a sample of 2,386 healthcare organizations. The analysis considers both the learning-opportunity benefits and the cost effectiveness of healthcare security investments.
Results indicated that proactive security investments are associated with lower security failure rates. This supports the notion that attackers' abilities and threats evolve at such a rapid pace that it is important to learn from proactive initiatives. Proactive investments are also associated with smaller breaches and lower breach notification costs than reactive investments. This finding is contrary to literature stating that proactive investing results in overinvestment stemming from uncertainty. Results also indicate that security investments have more significant effects on external than internal failures. Regardless of investment type, effects are larger at the state level than at the organization level, indicating that security investments create positive externalities (i.e., they improve security for all parties involved). Results further indicate that voluntarily made proactive investments are associated with superior performance compared to those made under external pressure. With external regulatory mandates, the organization might be trying to meet the mandates rather than conducting a threat analysis of its own. Finally, though external regulatory requirements attenuate learning from proactive investments, the requirements at least do not hurt failure-induced learning from reactive investments. Effectively, the results indicate that CIOs should further emphasize proactive initiatives rather than purely reacting.
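The survival logic behind this kind of hazard analysis can be sketched on synthetic data (hypothetical cohorts, not the paper's sample): the discrete-time hazard in each period is failures / organizations-at-risk, and the survival curve is the running product of (1 - hazard).

```python
# Minimal survival-curve sketch: each org either fails in a given period or never.
def survival_curve(failure_periods, n_orgs, horizon):
    """failure_periods: failure period per org, or None if never failed."""
    at_risk, survival, curve = n_orgs, 1.0, []
    for t in range(horizon):
        failures = sum(1 for p in failure_periods if p == t)
        hazard = failures / at_risk if at_risk else 0.0
        survival *= 1.0 - hazard
        curve.append(survival)
        at_risk -= failures
    return curve

# Made-up cohorts: proactive investors fail later and less often.
proactive = survival_curve([None, None, 3, None, 2, None], n_orgs=6, horizon=4)
reactive = survival_curve([0, 1, 1, None, 2, 3], n_orgs=6, horizon=4)
print([round(s, 2) for s in proactive], [round(s, 2) for s in reactive])
```

A Cox model adds covariates (such as proactive investment levels) to this hazard rather than estimating it separately per group, but the at-risk accounting shown here is the same.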
Kwon and Johnson (2014) - Heather
Wang et al. (2015) - Joe
Mitra and Ransbotham (2015) - Sid
Angst et al. (2017) - Jack, Leting
Pang and Tanriverdi (2017) - Xi
The paper addresses the debate between privacy regulation and health information exchange (HIE) incentives. On one hand, patient consent requirements add administrative costs and restrict the availability of patient information. On the other, a system that assumed patients' willingness to participate without obtaining explicit consent (i.e., an opt-out system) would not be acceptable. Thus, policy makers seeking to foster the growth of HIE efforts face the same challenge that emerges in other industries: how to address privacy concerns without over-regulating the disclosure of personal information and stifling the growth and emergence of valuable information technology efforts reliant on it. The authors therefore explore whether different forms of privacy regulation enable or impede HIE efforts. They posit that incentives could offset the significant costs associated with HIE efforts, including those that arise from varying degrees of privacy regulation. Using semiannual data from a six-year period (2004-2009), the authors pursue an empirical strategy that takes advantage of the fact that policy makers in different states have approached HIE challenges in different ways, enacting legislation that varied both in the incentives it created for HIEs and in the types of privacy protections it afforded to patient data exchanged through HIEs. The empirical investigation includes a fixed effects model (the main analysis) followed by a cross-sectional model to examine underlying factors and rule out confounding explanations. The authors also run a series of robustness checks to address endogeneity concerns. Although results show that privacy regulation without incentives had a negative effect on HIE efforts, the authors also find that privacy regulation, particularly regulation that includes consent requirements, was a necessary condition for incentives to positively impact HIE efforts. Incentives coupled with privacy regulation that included requirements for patient consent resulted in a 47% increase in the propensity of an HRR having a planning HIE and a 23% increase in the propensity of an HRR having an operational HIE. By contrast, incentives without any privacy regulation, or with privacy regulation that did not include consent requirements, resulted in little or no gain. The results contribute to the literature on the adoption and diffusion of IT in healthcare, in particular the factors and barriers that affect adoption, and this is one of the first studies to examine how varying approaches to privacy regulation affect the emergence of planning and operational HIEs. It also contributes to the economic and policy literature evaluating the impact of privacy protections on technological progress by showing that HIE incentives consistently offset the negative baseline effects of privacy regulation on HIEs and, more surprisingly, that incentives were more effective in doing so when coupled with privacy regulation that included consent requirements.
Ayabakan, S., Bardhan, I., Zheng, Z.E. and Kirksey, K., 2017. The impact of health information sharing on duplicate testing. MIS Quarterly, 41(4), pp.1083-1103.
Duplicate tests in healthcare create redundant costs for both patients and insurance providers, and information sharing within an integrated healthcare system can reduce such duplication. The formats and frequency of test information storage differ between radiology tests (low volume but high cost) and laboratory tests (high volume, low cost, manually processed). The authors postulate that implementation of health information sharing technologies will reduce the rate of duplicate tests, and that the reduction is more salient for radiology tests than for laboratory tests, especially when health information sharing technologies are implemented across disparate provider organizations.
The authors utilize a unique panel dataset consisting of around 40,000 patient visits to outpatient clinics and hospitals for laboratory and imaging tests related to the diagnosis and treatment of congestive heart failure, analyzed with a quasi-experimental approach. The results support the authors' hypothesis that implementation of an information sharing system can reduce duplicate testing for patients.
Bhargava, H. K., & Mishra, A. N. (2014). Electronic medical records and physician productivity: Evidence from panel data analysis. Management Science, 60(10), 2543-2562.
Contrary to decision makers in other industries, physicians in the healthcare sector perform not only knowledge work, such as making decisions and crafting treatment regimens based on patient information, but also, with the wide adoption of EMRs, data entry and system operation. Moreover, they are the healthcare practitioners who drive a majority of care decisions. Therefore, EMRs hold the potential to improve physician workflows and productivity and, consequently, contain healthcare costs. This paper examines two important research questions: (1) Does physician productivity change over time as a result of EMR implementation? (2) Does this impact differ for physicians of different specialties?
Measuring physician productivity is challenging; the authors argue that using WRVUs (work relative value units generated for clinical activities rather than administrative, teaching, training, or care coordination activities) to measure physician productivity can overcome the drawbacks of traditional measures, namely lack of robustness and normalization. Moreover, the theory of task-technology fit (TTF) suggests heterogeneity in the effects of EMR implementation on physician productivity. The dataset, which contains 3,186 physician-month productivity observations collected over 39 months, may suffer from omitted variable bias, self-selection bias, and attenuation bias when a causal model is constructed with OLS. The authors therefore use a differences-in-differences model and Arellano-Bond GMM to address these endogeneity concerns. Their results show that productivity drops sharply immediately after technology implementation and recovers partly over the next few months. The longer-term impact depends on physician specialty: the net impact of the EMR system is more benign for internal medicine physicians than for pediatricians and family practitioners.
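The differences-in-differences logic used here can be shown with a toy two-period example (illustrative numbers, not the study's data): comparing the change for EMR adopters against the change for not-yet-adopting physicians strips out the common time trend.

```python
# Hypothetical monthly productivity in WRVUs, before and after EMR go-live.
treated_pre, treated_post = 420.0, 400.0   # adopters: drop after implementation
control_pre, control_post = 410.0, 415.0   # non-adopters: mild secular trend

# DiD estimate: (T_post - T_pre) - (C_post - C_pre)
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # → -25.0
```

A naive before-after comparison on adopters alone would report -20 WRVUs; the DiD estimate of -25 attributes the remaining gap to the common trend the control group reveals. The paper's regression version adds physician and time fixed effects to the same idea.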
The authors find that on one hand, present-day EMR systems do not produce the kind of productivity gain that could lead to substantial savings in healthcare; at the same time, EMRs do not cause a major productivity loss on a sustained basis, as many physicians fear. Other implications and contributions are also discussed.
Bhargava, H. K., & Mishra, A. N. (2014). Electronic medical records and physician productivity: Evidence from panel data analysis. Management Science, 60(10), 2543-2562.
This paper examines the impact of EMRs on physician productivity. There are two specific questions: 1. Does physician productivity change over time as a result of EMR implementation? 2. Does this impact differ for physicians of different specialties?
The conceptual foundations for this study are based mainly on three streams of literature. The first is physician productivity: WRVUs, relative value units generated for clinical activities, are used to measure it. The second is IT-enabled productivity: extant research shows there may be significant differences between IT's short-term and long-term impacts. The third is task-technology fit: the paper points out that the two main functions of EMRs are information review and information entry, and given that physicians of different specialties place different demands on EMRs, the productivity impacts of EMRs on them are likely to vary.
This study uses data that include monthly physician schedules and production in a healthcare system from 2003 to 2006, a period during which an EMR system was implemented across clinics. Exploratory data analyses show that the EMR's impacts on productivity differ significantly between the first months and the period after six months, and the authors then estimate the learning period empirically. Next, they use OLS to estimate a model that accounts for heterogeneity across physicians and clinics. Lastly, they use Arellano-Bond system GMM estimation, which eliminates bias from unobserved heterogeneity by first-differencing and from endogeneity by using available lags and levels as instrumental variables.
Results show that (1) family practitioners (FPs) and pediatricians (Peds) are less productive in the stable phase than internal medicine physicians (IMs), and the net impact of EMRs is better for IMs than for FPs and Peds; and (2) FPs and Peds experience a decrease in productivity relative to IMs in the learning phase.
Menon, N. M., & Kohli, R. (2013). Blunting Damocles' sword: A longitudinal model of healthcare IT impact on malpractice insurance premium and quality of patient care. Information Systems Research, 24(4), 918-932.
Few studies have considered IT's implications for product or service quality. This study fills the gap by investigating the impact of healthcare IT (HIT) expenditure on the malpractice insurance premium (MIP) and the moderating effect of HIT expenditure on the relationship between MIP and patient care quality. Based on the prior literature on IT value and risk management, four hypotheses are proposed: (1) past HIT expenditure is positively associated with current quality of patient care; (2) past HIT expenditure is negatively associated with MIP; (3) past MIP is positively associated with quality of patient care; and (4) the relationship between past MIP and quality of patient care is enhanced (positively moderated) by past HIT expenditure.
There are multiple data sources in this study. Hospital information is from the Washington State Department of Health, and patient care outcomes are obtained from a data services and consulting company. Three regression models are applied. To control for likely persistence in organizational decisions and actions from year to year, lagged dependent variables are added. A first-differencing method is applied to remove fixed-effect factors, and instrumental variables are used to overcome the endogeneity problem. Results support all the hypotheses except H4: the moderating impact of HIT on the link from past MIP to quality of patient care is negative. The findings offer opportunities for future research.
This study contributes to understanding the expectation of IT benefits and its effect on an organization. It also informs decision makers in risk, quality management, and the IT function to engage in joint risk mitigation decisions to achieve desired organizational goals.
Atasoy, H., Chen, P., & Ganju, K. (2017). The spillover effects of health IT investments on regional healthcare costs. Management Science, forthcoming, 1-20.
Past research efforts have consistently demonstrated the quality benefits associated with the implementation of electronic health records (EHR), but research on their cost implications for healthcare is scarcer and more mixed. Atasoy, Chen, and Ganju (2017) shed light on this research gap by considering the spillover effects of health IT investments on neighboring healthcare providers' costs. Moreover, Atasoy et al. (2017) approach this research from a macro-level perspective by considering how one hospital implementing an EHR system affects costs both for itself and for surrounding hospitals. This approach is relevant because healthcare is often a community effort, with hospitals sharing patients. Thus, any variable that decreases communication costs between hospitals, such as EHR systems, should lower the financial burden for all parties in the supply chain. To verify this proposition, data were collated from a variety of sources, including the Healthcare Information Management Systems Society database, for the longitudinal period from 1998 to 2012.
Analyses considering said dataset found that though EHR system adoption increases costs at the adopting hospital, it lowers costs at surrounding hospitals. Specifically, the spillover effect is stronger when an increasing number of hospitals in the region are in health information exchange networks and in the same integrated delivery systems since these networks and systems facilitate information exchange. Moreover, for hospitals with regional characteristics that facilitate patient sharing, such as urban vs rural areas, population density, average distance between hospitals, and hospital density, the spillover effect is more pronounced. Finally, the HITECH Act, which increased the adoption and use of EHR systems, catalyzed the spillover effects. Effectively, a macro-level investigation into EHR systems indicates that they can reduce costs for collaborating hospitals.
The paper argues that information technology is a decentralizing force, whereas communication technology is a centralizing force. These technologies have at least two distinct components, information technology (IT) and communication technology (CT), and the paper studies their differential impact on the organization of firms, applying this framework to a world with two types of decisions, production and nonproduction. Results show that technologies that lower information costs for nonproduction decisions (like ERP) tend to empower plant managers relative to headquarters, and technologies that lower information costs for production decisions (like CAD/CAM) tend to empower workers relative to plant managers. In other words, a technology that lowers information costs increases the autonomy of the lower-level agent (a worker in the production case, a plant manager in the nonproduction case), whereas a technology that lowers communication costs reduces this autonomy. The study relies on a new data set that combines plant-level measures of organization and ICT hardware and software adoption across the United States and Europe, collected as part of a large international management survey. For identification, the authors rely on simple conditional correlations between the different ICT measures and the multiple dimensions of firm organization; instrumental variables show the results are robust. The work resolves the conundrum in the literature that lumps information technology (IT) and communication technology (CT) into a single homogeneous category, showing that the impact of IT and CT on the organization of firms, and ultimately on income inequality, will be quite different depending on the type of technology used.
Lamar Pierce, Daniel C. Snow, Andrew McAfee (2015) Cleaning House: The Impact of Information Technology Monitoring on Employee Theft and Productivity. Management Science 61(10):2299-2319.
Employee theft and fraud are widespread problems in firms, with workers stealing roughly $200 billion annually from U.S. firms to supplement their income. A growing empirical literature clarifies when and how theft and other misconduct occur but says little about the overall impact of firms’ use of forensics to monitor and reduce theft. This paper examines how firm investments in technology-based employee monitoring impact both misconduct and productivity by addressing three important yet unresolved questions: 1) is employee monitoring indeed effective in reducing theft, as economics suggests, or does monitoring demotivate and constrain employees and thus negate gains from theft reductions? 2) do possible gains from monitoring result from changing worker behavior or from replacing unethical workers with more honest ones? 3) if increased monitoring indeed reduces theft by existing workers, then through which mechanisms does productivity in other tasks change and what is the overall impact to the firm?
The paper starts with phenomenon-driven questions and explores the underlying economic, psychological, and behavioral mechanisms using theft and sales data from 392 restaurant locations across five firms. Using restaurant-level and individual worker-level differences-in-differences models, the authors find significant treatment effects in the form of reduced theft and improved productivity. They then dig deeper into mechanism identification: the majority of the productivity improvement and theft reduction is due to behavioral changes among existing workers rather than selection effects from managers replacing problem workers revealed by the IT system. Furthermore, the authors explore deeper mechanisms: economic multitasking, cognitive multitasking, and motivation from fairness or perceived increases in general oversight. Their results cast significant doubt on both the cognitive and economic multitasking mechanisms and provide mixed evidence on fairness concerns. Although the authors cannot directly test for perceptions of increased productivity monitoring, this explanation seems most consistent with their results. Implications are also discussed.
Tambe, P., & Hitt, L. M. (2014). Job Hopping, Information Technology Spillovers, and Productivity Growth. Management Science, 60(2), 338–355.
Like earlier general-purpose technologies such as electricity or the steam engine, information technology (IT) investments have been argued to generate productivity "spillovers" among firms, and the movement of IT workers among firms is believed to be an important mechanism by which IT-related innovations diffuse throughout the economy. In this paper, the authors use employee microdata obtained from an online resume database to test the hypothesis that firms benefit from the IT investments of other firms because the flow of specialized technical know-how among organizations facilitates the implementation of new IT innovations.
Based on the literature on the productivity impacts of R&D spillovers and IT investment, this study models the knowledge available to the focal firm as the weighted sum of the knowledge of other firms in the sample, with the transfer of IT-related knowledge occurring through the mobility of workers. Two IT investment measures are applied: an IT capital stock-based measure and an IT labor-based measure. The external IT pool measure is computed from the IT intensity of other firms. OLS and fixed-effects models support the hypothesis of a productivity spillover effect through IT labor flows. The study also compares the benefits transmitted through IT labor flows with those from the IT investments of geographically proximate firms; the estimates indicate that regional spillovers appear to be driven by IT labor flows, and that IT labor flows are an important source of spillovers even outside a fixed region.
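The external IT pool construct described above can be sketched as follows (firm names, IT intensities, and hire counts are all made up for illustration): the pool available to a focal firm is the hire-share-weighted sum of other firms' IT intensity, so firms it hires from heavily contribute more know-how.

```python
# Hypothetical IT intensity of the other firms in the sample.
it_intensity = {"A": 5.0, "B": 2.0, "C": 8.0}

# Hires into the focal firm, by source firm.
hires_into_focal = {"A": 6, "B": 3, "C": 1}

total_hires = sum(hires_into_focal.values())
external_it_pool = sum(
    (hires / total_hires) * it_intensity[src]
    for src, hires in hires_into_focal.items()
)
print(round(external_it_pool, 2))  # → 4.4
```

In the study's production-function regressions, a measure of this form enters as an additional input alongside the firm's own IT capital and labor.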
This study is the first to analyze how IT labor flows drive IT spillovers and to investigate the issue using microdata on labor mobility. It suggests that a substantial amount of the variation in IT returns can be explained by productivity spillovers generated by IT labor flows.
Atasoy, H., Banker, R. D., & Pavlou, P. A. (2016). On the Longitudinal Effects of IT Use on Firm-Level Employment. Information Systems Research, 27(1), 6-26.
This study examines how IT use affects firm-level employment: more specifically, how web applications and enterprise applications play differential roles in a firm's employment, and how firm size, the average skill level of employees, and industry technology intensity moderate the impact of IT on employment.
In the theory section, the paper specifies three mechanisms behind this impact: productivity gains, make-versus-buy decisions, and labor complementarity versus substitution. It then differentiates enterprise applications from web applications: because implementing enterprise applications requires larger investments and more organizational change, their effects take longer to materialize. Next, it lays out the moderators' roles: the effects of IT materialize more slowly in larger firms, and IT use should play a stronger role in employment for firms with highly skilled workforces and for firms in high-tech industries.
The study uses a firm-level survey from TurkStat, which has several advantages over commonly used datasets: it is more representative, more granular, and covers smaller firms. A fixed-effects model is used to examine the relationship between IT use and employment. The authors also employ several strategies to address endogeneity, including a rich set of control variables, an analysis of the timing of changes in IT use and employment, and a generalized propensity score approach. The results are robust.
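The fixed-effects estimation can be sketched via the within transformation (demeaning each firm's observations), using simulated data; the coefficient and variable names are illustrative, not estimates from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

n_firms, n_years = 50, 5
firm = np.repeat(np.arange(n_firms), n_years)
alpha = rng.normal(0, 1, n_firms)                 # unobserved firm effects
# IT use is correlated with the firm effect, so pooled OLS would be biased.
it_use = rng.normal(0, 1, n_firms * n_years) + alpha[firm]
beta_true = 0.6                                   # hypothetical effect on log employment
log_emp = alpha[firm] + beta_true * it_use + rng.normal(0, 0.1, n_firms * n_years)

def demean(x, groups):
    # Subtract each firm's mean: the within transformation removes alpha.
    means = np.bincount(groups, weights=x) / np.bincount(groups)
    return x - means[groups]

x_w, y_w = demean(it_use, firm), demean(log_emp, firm)
beta_hat = (x_w @ y_w) / (x_w @ x_w)              # within (fixed-effects) estimator
print(round(beta_hat, 2))  # close to beta_true = 0.6
```

Demeaning wipes out any time-invariant firm characteristic, so identification comes only from within-firm changes in IT use, which is the source of variation the study exploits.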
The paper reaches several conclusions. First, there is on average a positive relationship between IT use and firm-level employment. Furthermore, the effect of enterprise applications is lagged, whereas the effect of web applications materializes in the current year. The longitudinal impact of enterprise applications is more salient in larger firms, firms with higher average wages, and high-tech industries, while the current-year effects of web applications are more pronounced in small firms. The paper also offers several implications for public policy.
Bloom, N., Garicano, L., Sadun, R., & Van Reenen, J. (2014). The distinct effects of information technology and communication technology on firm organization. Management Science, 60(12), 2859-2885.
Information and communication technologies (ICTs) have radically reshaped industries and employee roles, but this impact has not been uniform across industries and positions. For example, ICTs have made ambassadors less essential because technology now makes it relatively easy to contact a country's leaders directly regardless of geographic distance; on the other hand, nurses' responsibilities and capabilities have grown immensely because ICTs allow them to accomplish tasks that previously required doctors or superiors. Bloom, Garicano, Sadun, and Van Reenen (2014) propose that this differing impact stems from the dual-component nature of ICTs; effectively, from differences between information technologies (ITs) and communication technologies (CTs).
ITs allow employees at lower levels to make impactful decisions, as in the case of nurses, by increasing the information readily available to them. Specifically, ERPs (for non-product-related decisions) and CAD/CAM (for product decisions) give lower-level employees access, at little to no cost, to information traditionally available only to high-level employees; these technologies also widen managers' spans of control. CTs such as intranets, by contrast, lead to more centralization, because easier communication can push more decisions and verifications up to higher-level employees. That said, the results for CTs are more ambiguous than those for ITs. Bloom et al. (2014) draw these conclusions by combining the CEP management and organization survey with the Harte-Hanks ICT panel to create a comprehensive dataset spanning industries and countries. Overall, the findings help explain the seemingly contradictory impacts of ICTs by distinguishing the IT and CT components and their differing effects.
Atasoy, H., Banker, R. D., & Pavlou, P. A. (2016). On the Longitudinal Effects of IT Use on Firm-Level Employment. Information Systems Research, 27(1), 6–26.
How IT investment affects firm-level employment is a critical question, with an ongoing debate over whether IT replaces human labor through automation or instead improves workers' productivity; the authors examine the longitudinal role of IT use in a firm's total number of employees. The dataset covers firms of different sizes across various industries in Turkey and captures firm-level use of enterprise software and systems such as ERP, CRM, and web applications.
The empirical specifications exploit both within-firm and between-firm variation to show a positive effect of IT use on firm-level employment, which varies across IT applications over time. Interestingly, the authors find that the effects of enterprise applications materialize after two years, whereas the effects of Web applications are realized in the current year. They also explore the moderating role of factors such as firm size, industry technology intensity, and average wages, finding that the long-term effects of enterprise applications on firm-level employment are more pronounced in larger firms, firms with higher average wages, and high-technology industries.
Paper: Information Technology and Administrative Efficiency in U.S. State Governments – A Stochastic Frontier Approach
While the extant literature on IT value in organizations focuses mostly on for-profit institutions, this paper provides a new perspective on how IT systems and investment can improve the efficiency of government and public administration. Analyzing a combined dataset of IT budget data from state governments, census data on state government expenditures, and a variety of information on the public services states provide, the authors measure technical efficiency using stochastic frontier analysis with a translog cost function and estimate the effect of IT spending on efficiency. The analyses provide evidence of a significantly positive relationship between IT spending and cost efficiency and indicate that, on average and all else being equal, a $1 increase in per capita IT budget is associated with $1.13 in efficiency gains.
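The stochastic frontier setup mentioned above can be written in simplified form as a translog cost function with a composed error; this sketch omits the paper's specific outputs, input prices, and controls:

```latex
% Translog cost frontier with a composed error term:
% v_{it} is symmetric noise, u_{it} \ge 0 captures cost inefficiency,
% so observed cost lies on or above the efficient frontier.
\ln C_{it} = \beta_0 + \sum_k \beta_k \ln x_{kit}
           + \tfrac{1}{2} \sum_k \sum_l \beta_{kl} \ln x_{kit} \ln x_{lit}
           + v_{it} + u_{it},
\qquad v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \sim N^{+}(0, \sigma_u^2)
```

Estimated inefficiency $u_{it}$ can then be related to IT spending to test whether higher IT budgets are associated with costs closer to the frontier.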
Since state governments use IT extensively for internal administration and for delivering public services such as education, social welfare, healthcare, and law enforcement, it is critical, from both a taxpayer perspective and an IT investment planning perspective, to understand whether the adoption of IT infrastructure indeed generates sufficient value to the public. Using data collected from NASCIO and census data on state government finances, the authors adopt a stochastic frontier approach with a cost function to estimate the relationship between IT spending and the cost efficiency of state governments. They find that the relationship is positive and statistically significant, suggesting that higher IT investment is associated with greater state government efficiency.