Browsing by Author "Jones, Donald R."
Now showing 1 - 13 of 13
Item: A comparison of the DeLone and McLean model of IS success and the work system method: Three field studies in healthcare organizations (2011-05). Lawrence, Richard J.; Jones, Donald R.; Cao, Qing; Hansen, Hans; Hoffman, James J.

How good is an information system? Two major approaches to answering this question have been put forward in the information systems literature: Information System Success models such as DeLone and McLean's (1992), and Alter's (2006) Work System performance indicators. The IS Success models were developed primarily to organize the plethora of (usually subjective) variables in research and secondarily to guide management practice; Alter's Work System Method was developed primarily to guide management practice with (usually objective) performance indicators. In this research I conducted field studies of three different work systems in the health care industry. A realized goal of the research was a model that not only integrates both approaches for measuring system success, but also incorporates work system participants' perceptions of system problems, opportunities, and risks. This new model increases our theoretical and practical understanding of IS Success and enables future research with increased practical implications.

Item: An empirical analysis of the value of IT investment, outsourcing, and strategy: An agile transaction cost perspective (Texas Tech University, 2007-12). Thouin, Mark F.; Hoffman, James J.; Jones, Donald R.; Ewing, Bradley T.; Ford, Eric W.

In a series of three studies, this thesis empirically examines the relationships among determinants of information technology (IT) business value. The first study uses the theoretical foundation of production economics to analyze changes in efficiency due to the level of investment in IT, the level of IT outsourcing, and the level of investment in IT personnel.
An analysis of 914 Integrated Healthcare Delivery Systems (IHDS) reveals that both IT budgetary expenditures and the number of IT services outsourced are associated with increases in IHDS profitability, while increases in IT personnel are not significantly associated with increased profitability. The second study uses the theoretical foundation of transaction cost economics (TCE) to explain the effect of the level of low-asset-specificity, commodity IT outsourcing on firm-level financial performance. An analysis of 825 IHDS reveals that higher levels of commodity IT outsourcing are associated with superior financial performance, resulting in an average $3,120,000 in savings, a 25% average increase in profit. The implications are significant in that organizations should use asset specificity to guide outsourcing decisions. The third study uses the General Theory of Competition (GTC) and the theory of competitive agility as theoretical foundations to posit that a firm's IT agility affects its ability to adapt to environmental change, achieve competitive advantage, and realize superior financial performance. In addition, the effects of commodity IT outsourcing and strategic IT outsourcing on IT agility are examined. An analysis of 822 IHDS reveals that increased IT agility is associated with superior financial performance, while commodity and strategic outsourcing are associated with increased IT agility.
These findings demonstrate the importance of organizational IT agility in achieving superior financial performance, the positive effect of outsourcing on IT agility, and the relevance of process-oriented theories of competitive advantage to the field of management information systems and the study of IT business value.

Item: An empirical investigation of factors promoting knowledge management system success (2006-08). Thomas, Bobby Dale; Jones, Donald R.; Viator, Ralph E.; Sherif, Karma S.; Westfall, Peter H.

The growing popularity of the knowledge-based theory of the firm, the view that organizational knowledge is one of the last remaining sources of long-term sustainable competitive advantage, has led to management's growing interest in knowledge management (KM) and knowledge management systems (KMS). To date, organizations that have implemented KMS have encountered mixed results. This research contends that existing KM studies fail to give adequate consideration to the importance of KM strategies in determining critical KMS success factors. The rationale behind this research is that by properly considering the moderating effect of KM strategy on the factors that influence KMS success, one can explain the success of a KMS (or lack thereof) using a greatly simplified list of success factors. This research draws on existing IS and KM frameworks, models, and literature to select four organizational factors believed to be critical for the success of a KMS; the study hypothesizes which of these factors are more critical for a knowledge exploration strategy (KRS) and which are more critical for a knowledge exploitation strategy (KIS). A web-based survey utilizing existing scales, some with slight adaptations, and a newly created strategy scale was administered to test the model; 204 complete responses were collected.
The results contribute to the literature by empirically confirming the hypothesized positive relationships between the identified success factors and KMS success. This research can serve as a foundation for future studies, which can help identify additional factors critical for KMS success.

Item: An interdisciplinary outlook of entrepreneurship, software, and information systems (2013-12). Jayaraman, Ganeshmani; Ferguson, Ralph E.; Yadav, Surya B.; Jones, Donald R.

This portfolio broadly examines the following disciplines: Software Engineering, Information Systems, and Management. The compilation of papers and projects in this portfolio draws out perspectives from the individual disciplines. The projects synthesize and bring cognitive integration to subjects such as software venture creation, strategy mapping of a software firm, and enterprise resource planning. The approach taken is to understand each discipline's expert heuristics, analyze them from an interdisciplinary view, and synthesize them into an integrated, comprehensive output. The first section is a synthesis paper on entrepreneurial models used in technology that develops a "Software Venture creation process." The paper captures the factors and considerations of software idea screening, the development of a business and financial model, and other market and industry factors to be considered in a software venture startup. The second section, "Application of Strategy Maps & Balanced Scorecard," explores the framework of strategy maps and balanced scorecards and synthesizes the application of a model strategy map for a software organization. Further, it explores the value chain and value creation strategies in software organizations and how the strategy map is converted into measurable key performance indicators in a balanced scorecard. The third section presents two software projects.
The first project is an enterprise resource planning implementation for a retail-manufacturing company using the SAP ERP enterprise product. The second project, titled "Outreach and Engagement Inventory," is a web-based application that demonstrates expert software engineering knowledge in requirements elicitation, software design conception, and implementation. Collectively, the three disciplines have helped develop an enhanced perspective: that of an entrepreneur constantly looking to discover new opportunities and ideas, of an analyst eliciting the information needs of business users, and of a software professional designing software to meet user needs.

Item: Bundled transactions of intellectual property: An explanation for the choice of governance form in the IT standard setting (Texas Tech University, 2007-08). Aggarwal, Nitin; Walden, Eric A.; Jones, Donald R.; Browne, Glenn J.; Wilcox, James B.

In this thesis, I develop a contingency model to explain the factors affecting transaction costs in the IT standard setting environment and how the choice of one governance form over another moderates this effect. The aim is to understand what causes the transaction costs in the IT standard setting process and what governance form best mitigates these costs.
The sub-objectives of this thesis are: to conceptualize and promote the intellectual property (IP) view of technology standards; to identify the different types of transaction costs involved in the IT standard setting process; to identify unique characteristics emerging from the IP view of a technology standard that affect transaction costs; to show that these transaction costs, associated with transacting for IP, exist for all governance forms associated with the standards development process; and finally, to show that different governance forms moderate the effect of these unique characteristics on transaction costs, making one governance form favorable over another for a given set of circumstances. To develop the model, I view IT standards as bundles of complementary IP (or methods) that are owned and controlled by multiple entities in many different locations and industries. Implementing a standard necessitates transactions between IP owners and standard implementers for the IP rights. However, these transactions are not simple. Due to the legal rights granted to the IP owner and the information asymmetry regarding their existence, there are substantial costs involved in locating and procuring the IP rights, and later, during the implementation of the IP, due to infringement issues. Three different transaction costs are considered: search costs, coordination costs, and opportunism costs. Search costs are associated with identifying the necessary IP and the IP owner. Coordination costs are associated with procuring the IP by entering into negotiations and contracts with the IP owners. Opportunism costs are associated with IP infringement and other consequences arising out of IP laws.
The unique characteristics, based on an IP view of standards, are identified as the complexity of the IT standard, its geographic scope, its industrial spread, and the substitutability of a method in the standard. Complexity refers to the nature and number of interdependencies that exist between the various methods that make up the standard. Geographic scope refers to the number of countries where the IP required for the standard may be held. Industrial spread refers to the different technical categories into which the IP may be classified. Finally, substitutability refers to the ability to substitute an IP in the bundle with the next best alternative without grossly affecting the functionality of the standard. Two stylized governance forms are considered in the study: hierarchies and networks. Hierarchies are characterized by a single firm (or a dominant firm and a few subservient partners) that controls the standard development process. Networks are characterized as a group of equals in which the standard development process is controlled by multiple different entities. The data were collected by surveying experts involved in IT standards development. A total of 436 people responded to the survey, for a response rate of 32.0%. OLS regression and logistic regression were used to analyze the data. The results show that there are significant transaction costs for all governance forms. Hierarchies or smaller networks are less expensive than bigger networks to start, but lose their edge as the bundle characteristics increase. The results also suggest that complexity and geographic scope are significant predictors of search costs, and bigger networks are better at moderating the effect of geographic scope on search costs. Industrial spread and geographic scope are found to be significant predictors of coordination costs, and bigger networks are better at managing the effect of complexity on coordination costs.
Finally, complexity is a significant predictor of opportunism costs, and bigger networks are better at moderating this cost than hierarchies. We also found that bigger networks are better at moderating the effect of substitutability on opportunism, while hierarchies are better at moderating the effect of geographic scope on opportunism. Thus, it is concluded that if there is more complexity in a standard and easy substitutes for patents are not available, then networks are preferred over hierarchies, and if the standard has larger geographic implications, then hierarchies are preferred over networks.

Item: Cyberspace security using keystroke dynamics (2015-08). Darabseh, Alaa; Namin, Akbar S.; Hewett, Rattikorn; Jones, Donald R.; Mengel, Susan A.

Most current computer systems authenticate the user's identity only at the point of entry to the system (i.e., login). However, an effective authentication system includes continuous or frequent monitoring of the identity of the user to ensure the user's identity remains valid throughout a session. Such a system is called a continuous authentication system. An authentication system with such a security scheme protects against certain attacks, such as session hijacking, that can be performed by a malicious user. Recently, keystroke analysis has acquired popularity as one of the main behavioral biometrics techniques that can be used for continuously authenticating users. There are several advantages to applying keystroke analysis: First, keystroke dynamics are practical, since every user of a computer types on a keyboard. Second, keystroke analysis is inexpensive because it does not require any additional components (such as special video cameras) to sample the corresponding biometric feature. Third, and most importantly, typing rhythms are still available even after the authentication stage has been passed.
A major challenge in keystroke analysis is the identification of the major factors that influence the performance accuracy of the keystroke authentication detector. Two of the most influential factors are the classifier employed and the choice of features. Currently, there is insufficient research addressing the impact of these factors in continuous authentication analysis; the majority of existing studies in keystroke analysis focus primarily on their impact in static authentication analysis. Understanding the impact of these factors will contribute to improving the performance of keystroke-based continuous authentication systems. Furthermore, most existing schemes of keystroke analysis require predefined typing models either for legitimate users or for impostors. However, it is difficult or even impossible in some situations to have typing data of the users (legitimate or impostor) in advance. For instance, consider a personal computer that a user carries to a college or a cafe; in this case, only the computer owner (the legitimate user) is known in advance. As another instance, consider a computer that has a guest account in a public library; in this case, none of the system users are known in advance. Thus, a new automated and flexible technique that can authenticate the user without any prior user typing model is needed. This dissertation focuses on improving continuous user authentication systems (based on keystroke dynamics) designed to detect malicious activity caused by another person (an impostor) whose goal is to take over the active session of a valid user.
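The timing measurements that keystroke-dynamics features build on can be sketched as follows (a hypothetical illustration, not the dissertation's implementation): given a log of key press/release timestamps, derive per-key hold times and digraph (key-to-key) latencies, the two raw quantities most keystroke features are computed from.

```python
# Illustrative sketch (hypothetical, not the dissertation's implementation):
# hold time = release - press for each key; digraph latency = press-to-press
# interval between consecutive keys.

def keystroke_features(events):
    """events: (key, press_time, release_time) tuples in typing order."""
    hold_times = {}
    digraph_latencies = {}
    for key, press, release in events:
        hold_times.setdefault(key, []).append(release - press)
    for (k1, p1, _), (k2, p2, _) in zip(events, events[1:]):
        digraph_latencies.setdefault((k1, k2), []).append(p2 - p1)
    return hold_times, digraph_latencies

# Toy session: the user types "th" (timestamps in seconds).
holds, digraphs = keystroke_features([("t", 0.00, 0.08), ("h", 0.15, 0.22)])
```

Statistics over measurements like these, collected per user, are what a classifier would compare against the current typing stream in a continuous authentication setting.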
The research is carried out by: 1) studying the impact of the selected features on the performance of keystroke continuous authentication systems; 2) proposing new timing features, based on the most frequently used English words (e.g., "the", "and", "for"), that can be useful in distinguishing between users in continuous authentication systems; 3) comparing the performance of keystroke continuous authentication systems under different algorithms; 4) investigating the possibility of improving the accuracy of continuous user authentication systems by combining more than one feature; and 5) proposing a new detector that does not require predefined typing models from either legitimate users or impostors.

Item: Evaluation of blockchain technology for cycle counting in supply chains with information discrepancies (2021-08). Padalkar, Nakul R.; Jones, Donald R.; Jin, Yuan; Sheikh-Zedah, Alireza; Song, Jaeki

Blockchain technology has been proven to reduce information discrepancies in the financial domain. Bitcoin's success and scalability have pushed Blockchain into the limelight. As such, retail and supply chain firms like Merck and Walmart are investing in Blockchain, which can give unprecedented visibility to supply chain operations. Inventory Information Discrepancies (IDs) are a hindrance to such visibility. Industry reports and white papers have proclaimed the benefits and quantified the value of Blockchain; to date, however, these claims have been primarily based on educated guesses and have not been subjected to rigorous model-based analysis. This dissertation argues that there may be several gaps in evaluating the value of Blockchain technology, some of which are bridged here with solid model analysis for inventories with information discrepancies. The manager makes inventory inspection and replenishment decisions at the beginning of each period, either with the traditional method (manual) or with Blockchain technology (automated with smart contracts).
A single-period cost function is developed and extended to the multi-period problem. The objective is to determine whether a Blockchain smart contract inventory system with discrepancy will perform better than the traditional setting. The numerical study of the two systems shows that, despite the discrepancy variability, demand volume, and the number of periods between cycle counts, the smart contract-based inventory system performs near-optimally and satisfies service level constraints better than the traditional systems. We also evaluate factors such as holding costs, penalty costs, and the number of periods between cycle counts in conjunction with discrepancies and Blockchain. In particular, Blockchain performs significantly better when the ratio of system penalty cost to holding cost is high. Also, allocating more cycle counts to the downstream partners reduces the ill effects of discrepancies and reduces the bullwhip effect for a large supply chain. The analysis and insights generated from this study can be used to design guidelines or scorecard systems that help managers design better cycle-count policies with Blockchain. Furthermore, guidelines for practitioners for developing smart contracts are provided in the form of pseudocode. Finally, by comparing the costs associated with the two systems, the true value of accurate inventory information, which Blockchain systems may provide, is quantified.

Item: Internet search engine result diversity, relevance, quality and the effects of search engine optimization (Texas Tech University, 2006-12). Xing, Bo; Lin, Zhangxi; De Silva, Dakshina G.; Jones, Donald R.; Durrett, John

During the last decade, Internet search engines have become a popular and increasingly effective technology, both in daily information seeking and in the business world. To date, however, there has been very limited research in this area.
Existing studies have primarily relied upon methods and theories originating from prior research on traditional Information Retrieval Systems. This dissertation is the first in-depth study of Internet search engines with an expanded view of an Information Retrieval System and its applications. On the one hand, it proposes a new framework of relevance assessment; on the other hand, it addresses the online advertising aspect of Internet search engines. Neither expansion has been adequately addressed in existing research in the domain of Internet search engines. The dissertation is composed of three independent studies, compiled into chapters 2 through 4, respectively. As the content on the Internet continues to grow and new indexing technologies become available, Internet search engines face an increasing challenge in ranking search results. The impact of the diversity of search results on the performance of search engines has not been systematically examined. The first study aims to address this research issue by proposing a new framework of relevance assessment based on Information Attributes. With an analytical model, the impacts of both diversity and information seeking method are identified. The second study empirically tests the findings of the first study. In addition, it uses search engine quality as the dependent variable. With controlled experiments and a factorial design, the study confirms some of the findings of the first study; inconsistent findings and their implications are discussed. The third study aims to answer the research question, "What is the impact of Search Engine Optimization (SEO) on the online advertising market?" This study builds an economic model that reveals the impact of SEO on the market price of paid placement and the sustainability condition of SEO firms. Together, the three studies explore and build theories in several untapped areas of Information Retrieval and the online advertising market.
The practical and theoretical implications are discussed in the concluding chapter.

Item: Investment decisions in acquiring information security measures: An empirical investigation (2016-08). Safi, Roozmehr; Browne, Glenn J.; Jones, Donald R.; Song, Jaeki; Walden, Eric

As a result of increased reliance on information technology, information systems and assets have become increasingly important for organizations and for individuals. Ensuring the security of these assets has therefore become a major concern, creating a huge demand for different types of security products and services. Gartner, for example, has forecast that worldwide expenditure on information security will increase from $77 billion in 2015 to about $108 billion in 2019, a 40% increase. In this environment, many decision makers struggle to determine how much they need to invest in security. The best answer can perhaps be obtained by applying quantitative risk analysis methods. However, applying these methods is often difficult, and most decision makers rely on non-quantitative methods to decide on their security expenditure, introducing a considerable amount of subjectivity into the problem. In spite of this, research in this area has left an important empirical question underexplored: how efficient is a typical decision maker in allocating monetary resources to security in a given security scenario? In addition, a security solution comprises different types of security measures. Two important types are those geared toward prevention and those geared toward detection and response. A given overall security budget can be allocated to these two types of measures in different ways, resulting in different levels of overall effectiveness. Theory and real-world evidence suggest that decision makers are not fully competent in allocating security budgets appropriately because they are biased toward funding prevention.
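The core of the quantitative risk analysis mentioned above can be sketched as follows (a minimal expected-loss calculation with made-up numbers, not the study's actual model or parameters):

```python
# Minimal expected-loss sketch (hypothetical numbers): an investment in a
# security measure is economically justified only when the reduction in
# expected loss it buys exceeds its cost.

def expected_loss(asset_value, breach_probability):
    # Expected loss = value at risk times the probability of a breach.
    return asset_value * breach_probability

def net_benefit(asset_value, p_before, p_after, cost):
    # Risk reduction achieved by the measure, minus what the measure costs.
    return (expected_loss(asset_value, p_before)
            - expected_loss(asset_value, p_after)
            - cost)

# A $1,000,000 asset; a control cuts breach probability from 5% to 1%
# and costs $25,000. Net benefit is positive, so the spend is justified.
benefit = net_benefit(1_000_000, 0.05, 0.01, 25_000)
```

The study's point is that human allocations deviate systematically from what a calculation like this prescribes, both in the total amount invested and in the prevention versus detection-and-response split.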
Accordingly, an important question needing empirical investigation is: how does a typical decision maker structure his or her security investment? In summary, one main goal of this research is to determine empirically how efficient human decision makers are in allocating monetary resources to security when the key attributes of the risk environment, as well as the key attributes of the available risk-mitigating measures, are known. The other main goal is to investigate empirically how a given security budget is allocated to prevention versus detection and response measures, the two main classes of security products and practices. Results from this study indicate that a typical decision maker tends to react to small security risks by investing in security when no security investment is economically justified. A typical decision maker also tends to overreact to larger risks by overspending on security when a much smaller investment is needed, though the magnitude of this overinvestment as a percentage of the justified investment amount tends to decline at higher levels of risk. Interestingly, the absolute value of overinvestment in security as a percentage of asset value remains stable at around 13%, regardless of risk level. Decision makers in the study also demonstrated a bias by investing more in preventive security measures even though investing in detection and response yielded the same return on security investment. The magnitude of this prevention bias was quite high, ranging from 30% to 60%.
Implications of the findings from this research are not limited to the theory and practice of information security; they also inform the theory and practice of security and safety in general.

Item: The effect of data structure and information presentation format on performance evaluation judgments using business performance management systems (2005-08). Peng, Chien-Chung; Viator, Ralph E.; Buchheit, Steve; Jones, Donald R.; Masselli, John J.

Developments in accounting information systems technologies make various data presentation methods possible. However, system developers want to know how information presentation choices will affect judgments. The current study investigates how data hierarchy and information presentation format affect performance evaluation judgments. Data hierarchy is an important attribute of accounting information presentation. For example, the XBRL technology developed to improve the communication of accounting information presents accounting information hierarchically. Prior studies have examined the effect of presentation format on judgment, but none has specifically investigated how data hierarchy can influence accounting judgments. The current study relies on the theory of cognitive fit and a cost-benefit analysis of cognition to develop a research model. The theory of cognitive fit argues that problem-solving performance is enhanced when the information presentation matches the task type, consistent with information processing strategies. In addition, cognitive psychology research suggests that the starting point used to drill down data hierarchies is an important factor in performance evaluation judgments. A total of 107 subjects participated in a laboratory experiment. The experiment used a 2x2x2 mixed between-subjects and within-subjects design. Data hierarchy (hidden signal versus no signal) and information presentation format (graphs versus tables) were between-subjects factors.
Task type (spatial versus symbolic) was manipulated as a within-subjects factor. Subjects were randomly assigned to the four treatment cells and asked to perform two different tasks. The results indicate that subjects using a data hierarchy without hidden signals of misleading solutions performed significantly better than subjects using a data hierarchy with hidden signals. In addition, subjects performed better when the information presentation format matched the task type. Finally, additional analysis suggests that the effect of graphical presentation of accounting information for spatial tasks may depend on how the information is organized: the performance advantage of graphical displays over tabular formats for spatial tasks disappears when the data hierarchy does not contain hidden signals at an aggregate level. This study extends the extant literature on information display research. It provides a more complete understanding of cognitive fit by examining an important problem-solving element, data hierarchy, and how it interacts with information presentation format and task type.

Item: Understanding DevOps: From its enablers to impact on IT performance (2019-08). Cogo, Gabriel Silva; Jones, Donald R.; Song, Jaeki; Aguirre-Urreta, Miguel

Previous research has addressed the process of delivering software as vital for the survival of Information Technology (IT) organizations. Yet, despite this valuable work, academia has struggled to keep track of constant and profound changes in the way software is developed and delivered to customers. DevOps has arisen as an evolution of decades of work on and improvement of software delivery processes, methodologies, and, above all, philosophies, and has fundamentally changed the way IT organizations are supposed to function.
While it has led to significant improvements in the overall performance outcomes of IT companies, both organizations and academia lack deeper knowledge of what it is, how it works, and whether it can actually lead to improved IT performance. This research's model draws on previous research on technological and management capabilities, and on IT culture, to identify the enabling factors of DevOps, and it proposes the factor of Delivery Approach to bridge the gap between these overarching factors and an IT department's actual outcomes. This research also proposes a set of ideal DevOps organizational values which, when aligned with the Delivery Approach, shows a significant effect on IT outcomes. Survey data were collected from IT professionals, mostly in managerial positions, with previous DevOps knowledge; 176 responses were collected in the U.S. The model shows that Culture and Management have a significant positive enabling effect on an organization's Delivery Approach, and that Delivery Approach, mediated by alignment with DevOps, has a strong effect on IT outcomes. This research contributes factor and measurement items in this domain; it also contributes to the theory base by helping build a still-young literature on DevOps and on software delivery in general, which has long been recognized as a vital process for any IT organization's goals.

Item: Understanding the effects of textual representational alignment on user search and stopping behavior (2016-08). Lucus, David Jon; Browne, Glenn J.; Jones, Donald R.; Song, Jaeki; Walden, Eric A.

User reviews have become a standard source of textual information that can be accessed during an information search. Prior research has identified several content-related attributes of user reviews (review length, extremeness, etc.) that make user reviews helpful to users; however, this research has neglected how this information is consumed by the user.
Psychological research has suggested that different textual representations (narrative or expository) are consumed differently by users based on the users' domain knowledge. In the present research, an experiment was conducted to examine how a user's search pattern (breadth of search and depth of search) and information search termination pattern are affected by different alignments between the textual representations (narrative and expository) and the user's domain knowledge. A search product (digital cameras) and an experience product (a music compact disc) were tested. Findings suggest that a user whose domain knowledge is aligned correctly with a textual representation (high domain knowledge/expository or low domain knowledge/narrative) will search more deeply and broadly than a user who is misaligned (high domain knowledge/narrative or low domain knowledge/expository). Limitations and areas for future research are discussed.

Item: Volunteers' participative behaviors in open source software development: The role of extrinsic incentive, intrinsic motivation and relational social capital (2006-05). Xu, Bo; Jones, Donald R.; Westfall, Peter H.; Lin, Zhangxi; McDonald, Robert E.

Open source software is a revolution in software development and represents a new mode of software distribution. The widespread diffusion of Internet access in the early 1990s led to a dramatic acceleration of open source activity. Compared with traditional software development, open source software development is informally organized, loosely structured, and lacks the formal control mechanisms used in traditional software development. The success of an open source project depends on the participation of voluntary developers. Current research on open source software has focused on understanding the motivations or incentives for open source participation at a general level; there has been very little research on the influence of community characteristics on participants' behaviors.
To better understand the motivations of open source participation, this study integrates incentive factors (reputation gaining, personal software needs, and learning purposes), an intrinsic motivational factor (enjoyment), and social relational factors (identification and obligation) to see how these factors impact participation level in open source communities, and how they work together and complement each other. An empirical study using a web survey methodology was conducted to test the research model. Data were collected from voluntary developers in many open source projects. Using the Structural Equation Modeling (SEM) method, data analysis showed that most of the research hypotheses were supported. The findings show that the relational social factors play a very important role in motivating open source project participation; professional benefits and enjoyment also influence participation. Virtual community quality is critical to the success of open source software development, and voluntary developers' participation can be promoted through community building, including member selection, goal congruence, promotion of interpersonal relationships, and provision of work-related and emotional support. The findings provide theoretical contributions to open source software research and practical implications for open source project management.
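As a toy illustration of the simplest precursor to the path estimates an SEM analysis like this produces (made-up numbers, not the study's data or its actual analysis), here is a Pearson correlation between a hypothetical motivation composite and a participation measure:

```python
import math

# Toy sketch (hypothetical data): correlate a survey composite for one
# motivation factor with a participation measure.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up composites: enjoyment score (1-5) vs. hours contributed per week.
enjoyment = [2.0, 3.5, 4.0, 4.5, 5.0]
hours = [1.0, 2.0, 4.0, 5.0, 7.0]
r = pearson(enjoyment, hours)  # strongly positive on this toy data
```

SEM goes further by estimating the latent factors and all hypothesized paths simultaneously, but the direction of each individual relationship is what a simple correlation like this previews.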