Information Disclosure and the Diffusion of Information Security Attacks (Mitra & Ransbotham) – Siddharth Bhattacharya
The paper addresses an ongoing debate in the research community about limited versus full disclosure of vulnerabilities that are often exploited by third parties. Proponents of limited disclosure argue that it ensures vendors and targets receive reasonable time to develop and deploy patches and countermeasures before systems are attacked, whereas full disclosure creates a window of opportunity for attackers before patches and countermeasures are deployed. On the other hand, full disclosure gives vendors incentives to create better-quality software and notifies security professionals so that they can install countermeasures immediately. The paper therefore asks: Does full disclosure speed the diffusion of attacks corresponding to a vulnerability through the population of target systems? Does full disclosure increase the risk that a firm is attacked for the first time on any specific day after the vulnerability is reported, given that it has not been attacked prior to that day? Does full disclosure increase the number of target firms affected by attacks based on the vulnerability? Does full disclosure increase the volume of attacks based on the vulnerability?
The authors use measures developed in earlier literature to first form analytical estimates of quantities such as Na(t), the cumulative number of attacked systems at time t, and Np(t), the cumulative number of protected systems at time t, and then use these to develop their hypotheses. Next, the authors augment this analytical work with two main data sources: a proprietary database of alerts generated by intrusion detection systems (IDSs) installed in client firms of an MSSP during 2006 and 2007, combined into a panel data set with dates from the National Vulnerability Database (NVD) to obtain detailed characteristics of the vulnerabilities.
The authors use a series of models (a non-linear regression model, a Cox proportional hazards model, and finally a Poisson model) to test their hypotheses, all of which are supported. Results indicate that full disclosure accelerates the diffusion of attacks corresponding to a vulnerability. Full disclosure also increases the risk of first attack on any specific day after the vulnerability is reported, and it increases the penetration of attacks within the population of target systems. Additionally, although the aggregate volume of attacks remains unaffected by full disclosure, attack activity shifts earlier in the life cycle of a vulnerability, reducing its effective life span but intensifying activity while active. The paper makes several contributions. Practically, quantifying the net effect of information disclosure on the diffusion of attacks informs the continuing debate about the optimal disclosure of information security vulnerabilities. It also adds depth to the debate about limited versus full disclosure and uncovers a potential negative effect of full disclosure. Finally, it adds to the diffusion-of-innovation literature by focusing on the diffusion of a societally undesirable innovation rather than the positive innovations studied before.
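The risk-of-first-attack logic behind the Cox analysis can be illustrated with a discrete-time hazard on synthetic data. This is a minimal sketch only: the hazard rates, firm counts, and horizon below are hypothetical illustrations, not the paper's estimates, and the paper's actual model is a continuous-time Cox regression on MSSP alert data.

```python
import random

random.seed(0)

def simulate_first_attack_days(n_firms, daily_hazard, horizon=120):
    """Simulate the day of first attack for each firm under a constant
    daily hazard; None means never attacked within the horizon."""
    days = []
    for _ in range(n_firms):
        day = None
        for t in range(1, horizon + 1):
            if random.random() < daily_hazard:
                day = t
                break
        days.append(day)
    return days

def empirical_hazard(days, t):
    """Share of firms attacked on day t among those not yet attacked before t
    (the conditional first-attack risk the paper's Cox model captures)."""
    at_risk = [d for d in days if d is None or d >= t]
    attacked = [d for d in at_risk if d == t]
    return len(attacked) / len(at_risk) if at_risk else 0.0

def share_attacked(days):
    """Fraction of firms ever attacked within the horizon."""
    return sum(d is not None for d in days) / len(days)

# Hypothetical assumption: full disclosure raises the daily first-attack hazard.
full = simulate_first_attack_days(2000, daily_hazard=0.04)
limited = simulate_first_attack_days(2000, daily_hazard=0.01)

print(share_attacked(full), share_attacked(limited))
print(empirical_hazard(full, 1), empirical_hazard(limited, 1))
```

Under these assumed hazards, the fully disclosed vulnerability both penetrates a larger share of the target population and shows a higher day-by-day conditional risk of first attack, mirroring the hypotheses tested in the paper.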
P: Campus visits can be great but exhausting experiences. They are often two-day events (two dinners) with faculty and can be extremely tiring. On top of that, visits from different schools are often back to back, making it even tougher. The main idea of the job talk is to collect and consolidate perspectives from all faculty in the department, not only from the dean; the whole department needs to give its opinion. That is the reason for full-day interviews. The department also signals that it is a great group that cares about the candidate, thus building the reputation of the school.
S: How is the decision on the job candidate made? Is it a single person's decision or a consolidated effort?
P: In many cases there is a committee that decides which candidates to interview and invite; it is a collective decision. In some cases everyone decides, and in others only senior people do. One important criterion is whether the candidate can survive the high-pressure environment.
S: It is often a big decision on the part of the school, since it is adding another permanent member. Apart from that, credentials for tenure and productivity also play an important role.
P: There are two kinds of hires: those who join fresh out of school and tenured professors. Decisions on tenured cases are sometimes even more important because those hires become a permanent part of the department, so the school does not want to hire jackasses, i.e., candidates who are very productive but tough to get along with. It is often hard to do research and collaborate with such people, so the school has to be extra careful while hiring.
P: It is important to make a great impression; if you make a bad one, you are in trouble. Even if most of the committee agrees, one strong "no" is often sufficient for rejection, so everyone's opinion matters. It is also important to acknowledge shortcomings and answer critical questions in the best way possible. A jackass will stress under such pressure and may challenge the questioners back, which often leads to rejection. You do not want to be offensive, but you should not cave in either; it is a real balancing act to acknowledge that your research has issues. The committee also wants to see how the person can teach top MBA and executive-MBA students. It is often like a date: to make it work you have to show commitment and interest in each other, and the same goes for interviews.
S: The school wants to hire people who are committed to the program and not going to leave in two or three years, as that is a huge investment loss; it is expensive to hire a tenure-track person.
P: You have to show interest in the city you are going to, and you have to take every interview seriously. Lack of interest can mean rejection even for strong profiles, and too much movement is often a bad signal.
P: Job talks can often become research collaboration opportunities with faculty members from that school. Candidates can also talk about other interesting questions and miscellaneous work, though the focus remains on the job market paper. Candidates should prepare 5-minute, 10-minute, and 30-minute introductions. It is important to have a strong research portfolio along with varied teaching interests to diversify your chances further.
The paper examines the debate between privacy regulations and health information exchange (HIE) incentives. On one hand, patient consent requirements add administrative costs and restrict the availability of patient information. On the other, a system that assumed patients' willingness to participate without obtaining explicit consent (i.e., an opt-out system) would not be acceptable. Thus, policy makers seeking to foster the growth of HIE efforts face the same challenge that emerges in other industries: how to address privacy concerns without over-regulating the disclosure of personal information and stifling the growth and emergence of valuable information technology efforts reliant on it. The authors therefore explore whether different forms of privacy regulation enable or impede HIE efforts. They posit that incentives could offset the significant costs associated with HIE efforts, including those that arise from varying degrees of privacy regulation. Using semiannual data from a six-year period (2004–2009), the authors exploit the fact that policy makers in different states have approached HIE challenges in different ways, enacting legislation that varied both in the incentives it creates for HIEs and in the types of privacy protections it affords to patient data exchanged through HIEs. The empirical investigation includes a fixed-effects model (the main analysis) followed by a cross-sectional model to examine underlying factors and rule out confounding explanations. The authors also run a series of robustness checks to address endogeneity concerns. Although results show that privacy regulation without incentives had a negative effect on HIE efforts, the authors also find that privacy regulation, particularly regulation that includes consent requirements, was a necessary condition for incentives to positively impact HIE efforts. Incentives coupled with privacy regulation that included requirements for patient consent resulted in a 47% increase in the propensity of an HRR having a planning HIE and a 23% increase in the propensity of an HRR having an operational HIE. By contrast, incentives without any privacy regulation, or with privacy regulation that did not include consent requirements, resulted in little or no gain. The results contribute to the literature on the adoption and diffusion of IT in healthcare, in particular the factors and barriers that affect adoption, and this is one of the first studies to examine how varying approaches to privacy regulation affect the emergence of planning and operational HIEs. It also contributes to the economic and policy literature evaluating the impact of privacy protections on technological progress by showing that HIE incentives consistently offset the negative baseline effects of privacy regulation on HIEs and, more surprisingly, that incentives were more effective in doing so when coupled with privacy regulation that included consent requirements.
The paper argues that information technology is a decentralizing force, whereas communication technology is a centralizing force. These technologies have at least two distinct components, information technology (IT) and communications technology (CT), and the paper studies their differential impact on the organization of firms, applying the framework to a world with two types of decisions: production and nonproduction. Results show that technologies that lower information costs for nonproduction decisions (like ERP) tend to empower plant managers relative to headquarters, and technologies that lower information costs for production decisions (like CAD/CAM) tend to empower workers relative to plant managers. In other words, a technology that lowers information costs increases the autonomy of the lower-level agent (a worker in the production case, a plant manager in the nonproduction case), whereas a technology that lowers communication costs reduces this autonomy. The study relies on a new data set that combines plant-level measures of organization and ICT hardware and software adoption across the United States and Europe, collected as part of a large international management survey. For identification, the authors rely on simple conditional correlations between the different ICT measures and the multiple dimensions of the organization of the firm, with instrumental variables used to show robustness of the results. The work resolves a conundrum in the literature, which lumps information technology (IT) and communication technology (CT) into a single homogeneous category, by showing that the impact of IT and CT on the organization of firms, and ultimately income inequality, will be quite different depending on the type of technology used.
The paper investigates how the diffusion of the Internet influenced research collaboration within firms. Previous literature has suggested that research collaboration is hampered by significant coordination costs that increase with team size, and that adoption of information technology such as the Internet lowers coordination costs and thus increases the returns to collaborative work. However, although some work exists on whether IT adoption helps academic collaboration, no such work exists for industrial collaboration; this is the first to do so. Motivating the hypothesis with prior models from Becker and Murphy (1992), the authors hypothesize that an increase in IT investment (here, Internet adoption) will be associated with an increase in the likelihood of geographically dispersed, multi-inventor collaboration relative to collaboration within the same region or single-inventor output. The data come from multiple sources, including patenting data from the USPTO, R&D data from Compustat, and regional controls from US Census County Business Patterns. The analysis uses a difference-in-differences framework, comparing the incidence of collaborative patents in a firm-location pair before the treatment of basic Internet adoption to the incidence after the treatment. The model is a fixed-effects linear probability model that controls for observable changes in firm-pair conditions, firm-location employment, and other factors that could affect collaboration volume, as well as location fixed effects. The diff-in-diff results show a statistically significant increase in the incidence of collaborative patenting for cross-location pairs adopting the Internet over the period, relative to non-adopters. This is not the case for same-location teams or single inventors. To test the robustness of the findings, the paper uses falsification tests and instruments to rule out endogeneity. The results remain robust and consistent. The research addresses a debate in previous literature about whether IT adoption can increase collaboration by reducing coordination costs, and managerially it has implications for the integration of geographically dispersed organizations and the long-run design of research organizations within firms.
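The core 2x2 difference-in-differences comparison can be sketched in a few lines. The incidence values below are hypothetical illustrations, not the paper's data, and the paper's actual specification additionally includes fixed effects and controls.

```python
# Minimal 2x2 difference-in-differences: the change in collaborative-patent
# incidence for Internet-adopting firm-location pairs, net of the change
# for non-adopting pairs over the same period.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """DiD estimate: (treated post - treated pre) - (control post - control pre)."""
    return (mean(treated_post) - mean(treated_pre)) - (
        mean(control_post) - mean(control_pre)
    )

# Hypothetical incidence of a collaborative patent (1 = observed) per pair-period.
treated_pre = [0, 0, 1, 0, 0, 1, 0, 0]   # adopters, before Internet adoption
treated_post = [1, 1, 0, 1, 0, 1, 1, 0]  # adopters, after adoption
control_pre = [0, 1, 0, 0, 0, 1, 0, 0]   # non-adopters, early period
control_post = [0, 1, 0, 1, 0, 0, 1, 0]  # non-adopters, later period

estimate = diff_in_diff(treated_pre, treated_post, control_pre, control_post)
print(estimate)
```

The estimate nets out the secular trend common to both groups, which is the identifying idea behind comparing adopting cross-location pairs to non-adopters.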
Traditionally, outsourcing deals require long-term contracts between clients and vendors, and these have been a major focus of previous literature. This has also led to frustration among practitioners who seek sustainable value from outsourcing. The paper focuses on renegotiation of contracts and investigates the role of decision rights delineated ex ante that enable Pareto-improving amendments. Specifically, the authors look at the effect of flexibility provisions, termination-for-convenience rights, and vendor rights to redeploy know-how to areas outside the contract on the likelihood of Pareto-improving amendments, against the default outcome of parties completing the contract without ex post renegotiation.
Two empirical challenges had hampered previous studies: lack of good data and lack of rigorous identification strategies. The authors overcome these problems by using 10-Q, 10-K, and 8-K filings with the SEC, coupled with data from press releases and trade and business reports from clients and vendors. The authors use a probit model for their initial analysis to estimate the probability that a contract between client and vendor has a Pareto-improving amendment, where the independent variables include decision rights delineated ex ante, the characteristics of the task contracted upon, contractual provisions, vendor-specific characteristics, and client characteristics (including other controls). The results show that flexibility provisions, termination-for-convenience rights, and contractual rights whereby vendors are granted rights to reuse know-how result in Pareto-improving amendments. The results are robust to potential endogeneity of contractual provisions when parties have feasible foresight and to the possibility of adverse selection in the sample.
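The probit link used in the analysis can be sketched with the standard normal CDF. The coefficients below are hypothetical illustrations chosen only to show the direction of the effects; they are not the paper's estimates.

```python
import math

def norm_cdf(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_prob(coefs, x):
    """P(Pareto-improving amendment = 1 | x) under a probit model: Phi(b0 + b.x)."""
    z = coefs[0] + sum(b * xi for b, xi in zip(coefs[1:], x))
    return norm_cdf(z)

# Hypothetical coefficients: intercept, flexibility provision (0/1),
# termination-for-convenience right (0/1), redeployability right (0/1).
coefs = [-1.0, 0.8, 0.6, 0.5]

# A contract with all three ex ante decision rights vs. one with none.
p_all = probit_prob(coefs, [1, 1, 1])
p_none = probit_prob(coefs, [0, 0, 0])
print(round(p_all, 3), round(p_none, 3))
```

With positive coefficients on the three decision rights, the model assigns a markedly higher amendment probability to the contract that delineates them ex ante, which is the qualitative pattern the paper reports.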
The research contributes to the literature by bringing the angle of renegotiation into contractual decisions, highlighting the importance of renegotiation design in enabling ex post adaptation. It also has implications for practice. Limitations of the analysis include long-term knowledge spillovers (due to redeployability) and noncontractual mechanisms of enforcement, which may affect ex post renegotiation.
The paper studies the effect of IT deficiencies on senior executive (CEO/CFO) turnover in firms. Most previous research has applied the framework that the greater the shared knowledge and mutual trust among top executives, the greater the firm's IT success, but this body of knowledge has not shed light on which senior executives should be responsible for which specific IT management activities. Thus the paper asks: for which IT management responsibilities are particular senior executives held accountable when serious IT deficiencies occur? The paper takes advantage of the Sarbanes-Oxley Act of 2002 (SOX), which was established to strengthen internal controls over financial reporting by U.S. public firms and which helps capture IT material weaknesses. The paper develops multiple hypotheses that firms reporting higher numbers of IT material weaknesses will experience a greater likelihood of CEO/CFO turnover. Next, the authors divide material weaknesses into categories of global, demand-side, and supply-side IT material weaknesses and posit that firms reporting higher numbers of global or demand-side IT material weaknesses will experience a greater likelihood of CEO/CFO turnover. Using data from Audit Analytics, the authors examined each firm's reported material weaknesses and classified each as either an IT or a non-IT material weakness, then combined this with data from SEC proxy statements to identify CEO/CFO turnovers. The analysis involves probit regressions with CXO turnover as the dependent variable and the number of IT weaknesses and number of non-IT weaknesses as independent variables. The results show that the number of IT weaknesses is significant in predicting CEO turnover. Results also show that IT Architecture and IT Control Oversight–External are significant predictors, and that IT Control Oversight–Internal is a strong predictor as well. The results are robust, and a Heckman model controls for endogeneity concerns due to selection bias. The findings suggest that CEOs and CFOs are selectively affected by serious IT deficiencies: for CEOs, deficiencies traced to IT Architecture and IT Control Oversight–External were associated with higher turnover likelihoods, while for CFOs, deficiencies traced to IT Control Oversight–Internal were associated with higher turnover. Inconsistencies and limitations of the study are discussed, along with contributions to IS, management, and practice.
The paper investigates software process diversity and its antecedents, and further links software process diversity and organizational process compliance to software project performance. To this end, the paper defines key process areas (KPAs) in a project and the various dimensions (separation, variety, disparity) of the KPAs implemented in a project. To obtain relevant measures of the constructs, the authors use an innovative discovery-based research approach in which they first dig deep into the literature and then assess its relevance to actual practice to construct various measures (examples include code size and degree of customer involvement). The data come from a leading multinational software company renowned for process innovation. Focus groups, meetings, and individual interviews were carried out to uncover variations such as plan-based versus process-based process design. The paper then develops five hypotheses connecting software process diversity, organizational compliance, and the fit between process diversity and process compliance to project performance. The dependent variable, project performance, consists of two measures: software productivity and quality. The model is divided into two parts. The first measures the impact of antecedents on process diversity, and the findings lend credence to the hypotheses that requirements volatility, design novelty, and customer involvement all have a positive and significant effect on process diversity. From the second analysis, the authors conclude that a better fit between process diversity and process compliance efforts yields higher productivity and quality, respectively. Control variables show that requirements volatility, software code size, and large team size have a negative effect on performance; similarly, too much customer involvement also has a negative effect. The paper performs a series of robustness checks to confirm the results. The paper thus draws on prior work on organizational and demographic diversity and extends the literature by examining software diversity through the lens of the variety and disparity dimensions. It extends the argument on planning versus agility and encourages the literature to look beyond these. Implementable guidelines for practitioners are also highlighted.
Competing in Crowded Markets: Multimarket Contact and the Nature of Competition in the Enterprise Systems Software Industry – Siddharth Bhattacharya
The ESS market has grown in size and consists of several horizontal markets around specific software components. Although individual ESS firms do not develop all the software components, each firm typically offers several of them. ESS firms should benefit from competing in as many product markets and serving as many vertical industry segments as possible. Theories of multimarket contact suggest that ESS firms can engage in tacit collusion and mutual forbearance to reap performance benefits. At the same time, many of the markets where ESS firms operate are crowded with rivals; crowded markets subject firms to high market-domain overlap with industry rivals and adversely impact their performance. Thus the paper asks the following research questions: Should ESS firms compete in as many markets as possible? Are they better off avoiding crowded markets? Do they benefit from repeatedly facing rival firms in multiple markets? The data come from an unbiased industry group (Reed Elsevier Inc.) that employed a consulting organization to collect revenue and other information for a nearly complete set of ESS firms through surveys every other year. Additional elements such as firm size and alliances are acquired from other independent sources, including the Mergent Online company database, Securities and Exchange Commission filings, the Gale Group database, and One Source Business Browser. The empirical analysis consists of a time fixed-effects, random-coefficients regression. Results show that once a firm sells a component, it benefits by creating templates for different vertical industry segments. Higher multimarket contact leads to better firm performance, though externality benefits may be diluted by heightened rivalry. Finally, results reveal that firms operating in crowded markets and experiencing high market overlap can enhance their performance through multimarket contact. Robustness checks rule out endogeneity concerns and validate the use of the random-effects model.
The paper develops a conceptual model to understand the effect of cross-functional and cross-organizational coordination on firm-level manufacturing performance. Previous papers have discussed manufacturing performance in the context of interfunctional and interorganizational coordination, but these have proceeded as parallel literature streams. This paper provides a more holistic picture by considering an integrated information systems construct and manufacturing-IS coordination in conjunction with manufacturing-marketing and manufacturing-supply chain coordination. The paper hypothesizes that coordination between a firm's manufacturing and marketing functions, between manufacturing and supply chain partners, and between manufacturing and IS functions will have a positive relationship with integrated IS capability. The data come from PIMS email contacts provided by APICS. To measure each coordination activity, the authors use a scale for each while creating a composite measure of manufacturing performance comprising operating margins, on-time (backlog) ratios, and inventory turns. The authors use an OLS specification for measuring manufacturing performance, with coordination activities, IS integration capability, and their interactions as independent variables. To address endogeneity, the authors employ a number of robustness checks, including a Heckman model. Due to the survey nature of the design there are some concerns, such as respondent bias, which the authors acknowledge as limitations. The final results show that a firm's integrated IS capability, and its complementary effect with the other verticals (manufacturing, marketing, and supply chain), are significant predictors of manufacturing performance.
Two perspectives have persisted in previous research. One says that in the face of disruption, a firm's reactive ability determines how much superior profit it can garner until a new equilibrium is reached. The other holds that firms are more interconnected than ever before, often sharing and utilizing each other's resources and inputs. This paper asks how an operational disruption at a firm that depends on shared resources affects not just the disrupted firm but also the other firms sharing the resource. Further, the paper investigates whether this effect is moderated by the routine complexity of the disrupted firm and by the affected firms' own routine complexity. The context of the research is the US airline industry, especially episodes of major disruption (37 between 2000 and 2017, of which four major ones are considered here). The shared resources are airports, while the performance data are gathered from the BTS On-Time Performance (OTP) database. The paper uses a simple OLS specification, measuring departure delay, arrival delay, and excess travel time as measures of on-time performance. With a data set of 75,051 flights during the four disrupted periods, the paper finds that disruptions to one airline may have negative effects on the on-time performance of competitors who share the same resource. Further, results indicate that when FSCs (full-service carriers), i.e., those with more complicated routines, are disrupted, there are significant negative spillovers onto competing LCCs (low-cost carriers) and FSCs. These results reverse when LCCs are disrupted, with positive performance improvements. The analysis section performs extensive robustness checks to ensure robust identification. The research has important implications for firms, IT designers, and regulators, and it contributes to the literature by being the first to show the effect of a focal firm's disruption on competitors with whom it shares resources.
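The OLS spillover specification can be sketched on synthetic data: regress a competitor's departure delay on a dummy for "rival at the shared airport is disrupted." The data-generating numbers below are hypothetical illustrations; the paper's models include many additional controls (carrier, airport, and time effects).

```python
import random

random.seed(1)

def simulate_delays(n, spillover_effect=12.0):
    """Hypothetical flights: baseline delay of 10 minutes plus an assumed
    spillover of `spillover_effect` minutes when a rival is disrupted."""
    rows = []
    for _ in range(n):
        disrupted = 1.0 if random.random() < 0.5 else 0.0
        delay = 10.0 + spillover_effect * disrupted + random.gauss(0, 5)
        rows.append((disrupted, delay))
    return rows

def ols_simple(rows):
    """Closed-form OLS of y on a single regressor: returns (slope, intercept)."""
    n = len(rows)
    mx = sum(x for x, _ in rows) / n
    my = sum(y for _, y in rows) / n
    sxx = sum((x - mx) ** 2 for x, _ in rows)
    sxy = sum((x - mx) * (y - my) for x, y in rows)
    slope = sxy / sxx
    return slope, my - slope * mx

rows = simulate_delays(5000)
slope, intercept = ols_simple(rows)
print(round(slope, 1), round(intercept, 1))
```

With a binary regressor, the OLS slope is just the difference in mean delay between disrupted-rival and normal periods, which is the spillover quantity the paper estimates.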
While other university departments such as Physics and Math need PhD students to assist professors in their research, business schools do not necessarily do so; they want to cultivate new scholars in the field, scholars who create new knowledge. PhD students in business schools bring in fresh perspectives and new ideas. The university wants to create independent research scholars for the future with support from faculty members. When these students graduate with great-quality papers and go to reputable schools, the university's reputation goes up, and when they make tenure at good research schools, the university's reputation as a research-driven school rises further. All this attracts new and better students, making the entire process cyclic in nature. The school invests a lot in its students, and it is their duty to reward the school back, not only through great research publications but also by increasing the reputation of the school in the broader research community.
The paper asks the important question of whether IT creates value that is reflected in the market value of firms and in their future profitability. This question has been hard to address, and there have been varying viewpoints on the topic. Proponents argue that companies can make innovative use of IT in ways that create value by taking advantage of a firm's unique resources. Critics, on the other hand, argue that IT is easily replicable and cannot provide sustained competitive advantage, as profits from improved business would be competed away. A further line of thinking is that firms may develop specific architectures and skills that are not easily replicable, so IT capabilities may themselves become a strategic resource. The paper aims to resolve this conundrum by utilizing the Y2K bug situation in 2000, which had forced firms to think seriously about their existing IT strategies.
The paper hypothesizes that there is a positive association between shareholder value and Y2K spending, and that this is more pronounced in transform industries than in informate industries, and in informate more than in automate industries, in that order. It also hypothesizes a negative association between shareholder value and industry-level spending on Y2K, and a positive association between earnings in future periods and Y2K spending that is stronger for firms in transform industries than in the other industries. The authors employ a market valuation framework similar to that of Lev and Sougiannis (1996); the novelty of the method is the use of earnings in one period as predictors of earnings in the future. The results show that firms that spent more in the Y2K period experienced the benefits of improved earnings and increased firm value, and these benefits were more pronounced in transform industries than in the other two. These companies, in the long run, were more flexible and faster to adopt upcoming trends such as e-business, providing them with additional benefits. The authors also carry out a number of robustness checks and rule out endogeneity concerns (using 3SLS, etc.), with the results remaining consistent.