Technical Reports

 

Return to Technical Report System

USC-CSE-2005-519 PDF

 

Recovering Architectural Views of OO Systems


Vladimir Jakobac, Advisor: Nenad Medvidovic

In this paper, we present an iterative, user-guided approach to recovering architectural information from OO legacy systems, based on a framework for analyzing and visualizing software systems. The framework is built around a pluggable and extensible set of clues and rules about a given problem domain, execution environment, and/or programming language. Architecture-relevant information is recovered by providing several architectural views. While the purpose view captures the high-level functionality of the system's elements, the usage view identifies regions of related elements. These views are then used as input to a subsequent phase that produces high-level structure and interaction views of the system. We have developed an implementation prototype of our framework targeted at Java systems. The tool is integrated with IBM Rational Rose®.
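The clue-and-rule mechanism the abstract describes can be sketched as a set of pluggable predicates that each vote on a class's purpose. This is a hypothetical illustration only: the rules, category names, and class representation below are invented for the example, not the tool's actual rule set.

```python
# Illustrative sketch of a pluggable rule set: each rule inspects
# evidence about a class (its name, its imports) and votes for a
# purpose category; a class may receive several votes.
RULES = [
    (lambda c: "java.awt" in c["imports"] or "javax.swing" in c["imports"],
     "user interface"),
    (lambda c: "java.sql" in c["imports"], "data storage"),
    (lambda c: c["name"].endswith("Controller"), "application logic"),
]

def classify(cls):
    """Return the purpose categories whose rules fire for this class."""
    votes = [purpose for rule, purpose in RULES if rule(cls)]
    return votes or ["unclassified"]
```

Because the rule list is just data, new clues for another domain or language can be plugged in without touching the classifier itself.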

All rights reserved (c) by the authors.

added: Dec. 12, 2005


USC-CSE-2005-518 PDF

 

SimVBSE: Developing a Game for Value-Based Software Engineering

 

Apurva Jain and Barry Boehm

The development of games in aid of improving and enriching a student's learning experience is again on the rise. The beer game [6] in the field of system dynamics was developed to instill the key principles of production and distribution. SimSE [5] provides a simulated game in which players take on the role of a project manager and experience the fundamentals of software engineering through cause-effect models. In this paper we present an initial design of SimVBSE, a game for students to better understand value-based software engineering [1] and its underlying theory [3].

Submitted to CSEET 2006. All rights reserved (c) by the authors.

added: Nov. 15, 2005


USC-CSE-2005-517 PDF

 

Finding the Right Data for Software Cost Modeling

Zhihao Chen, Barry Boehm, Tim Menzies, Dan Port

Strange to say, when building a software cost model, sometimes it's useful to ignore much of the available cost data. One way to do this is to perform data-pruning experiments after data collection and before model building. Experiments involving a set of Unix scripts that employ a variable-subtraction algorithm from the WEKA (Waikato Environment for Knowledge Analysis) data-mining toolkit illustrate this approach's effectiveness. This article is part of a special issue on predictor modeling.
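The data-pruning idea can be sketched in a few lines. This is a hypothetical illustration of backward variable subtraction, not the article's actual Unix/WEKA scripts: the nearest-neighbour effort estimator, the error measure, and the data set are all invented for the example.

```python
# Sketch of greedy feature pruning: repeatedly drop the cost-driver
# column whose removal most reduces estimation error, stopping when
# no removal helps.

def mean_abs_error(rows, features, target="effort"):
    """Score a feature subset: predict each project's effort as the
    effort of its nearest neighbour under the chosen features."""
    errs = []
    for i, r in enumerate(rows):
        others = rows[:i] + rows[i + 1:]
        nearest = min(others, key=lambda o: sum((o[f] - r[f]) ** 2 for f in features))
        errs.append(abs(nearest[target] - r[target]))
    return sum(errs) / len(errs)

def prune_features(rows, features):
    best = mean_abs_error(rows, features)
    improved = True
    while improved and len(features) > 1:
        improved = False
        for f in list(features):
            trial = [x for x in features if x != f]
            score = mean_abs_error(rows, trial)
            if score < best:  # strictly better without f: drop it
                best, features, improved = score, trial, True
                break
    return features, best

# Tiny made-up data set: 'size' drives effort, 'noise' is irrelevant.
projects = [
    {"size": 1, "noise": 9, "effort": 2},
    {"size": 2, "noise": 3, "effort": 4},
    {"size": 3, "noise": 7, "effort": 6},
    {"size": 4, "noise": 1, "effort": 8},
    {"size": 5, "noise": 5, "effort": 10},
]
kept, err = prune_features(projects, ["size", "noise"])
```

On this toy data the irrelevant column is discarded and estimation error falls, which is the effect the pruning experiments exploit at larger scale.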

 

Document characteristics: Software, IEEE
Publication Date: Nov.-Dec. 2005
Volume: 22 , Issue: 6
On page(s): 38 - 46

added: Nov. 11, 2005


USC-CSE-2005-516 PDF

 

Management challenges to implementing agile processes in traditional development organizations

Barry Boehm, Richard Turner

 

Discussions with traditional developers and managers concerning agile software development practices nearly always contain two somewhat contradictory ideas. On one hand, they find that on small, stand-alone projects, agile practices are less burdensome and more in tune with the software industry's increasing need for rapid development and coping with continuous change. On the other hand, managers face several barriers, real and perceived, when they try to bring agile approaches into traditional organizations. We categorized the barriers either as problems only in terms of scope or scale, or as significant general issues needing resolution. From these two categories, we've identified three areas - development process conflicts, business process conflicts, and people conflicts - that we believe are the critical challenges software managers of large organizations face in bringing agile approaches to bear in their projects.

 

 

Document characteristics: Software, IEEE
Publication Date: Sept.-Oct. 2005
Volume: 22 , Issue: 5
On page(s): 30 - 39

added: Nov. 11, 2005


USC-CSE-2005-515 PDF

 

Value-based processes for COTS-based applications

Ye Yang, Jesal Bhuta, Barry Boehm, Daniel N. Port

 

Economic imperatives are changing the nature of software development processes to reflect both the opportunities and challenges of using COTS products. Processes are increasingly moving away from the time-consuming composition of custom software from lines of code (although these processes still apply for developing the COTS products themselves) toward assessment, tailoring, and integration of COTS or other reusable components. Two factors are driving this change: COTS or other reusable components can provide significant user capabilities within limited costs and development time, and more COTS products are becoming available to provide needed user functions.

 

Document characteristics: Software, IEEE
Publication Date: July-Aug. 2005
Volume: 22 , Issue: 4
On page(s): 54 - 62

added: Nov. 11, 2005


 

USC-CSE-2005-514 PDF

 

Assessing COTS Integration Risk Using Cost Estimation Inputs

 

Ye Yang, Barry Boehm, Betsy Clark

 

Most risk analysis tools and techniques require the user to enter a good deal of information before they can provide useful diagnoses. In this paper, we describe an approach to enable the user to obtain a COTS glue code integration risk analysis with no inputs other than the set of glue code cost drivers the user submits to get a glue code integration effort estimate with the COnstructive COTS integration cost estimation (COCOTS) tool. The risk assessment approach is built on a knowledge base with 24 risk identification rules and a 3-level risk probability weighting scheme obtained from an expert Delphi analysis. Each risk rule is defined as one critical combination of two COCOTS cost drivers that may cause certain undesired outcome if they are both rated at their worst case ratings. The 3-level nonlinear risk weighting scheme represents the relative probability of risk occurring with respect to the individual cost driver ratings from the input. Further, to determine the relative risk impact, we use the productivity range of each cost driver in the risky combination to reflect the cost consequence of risk occurring. We also develop a prototype called COCOTS Risk Analyzer to automate our risk assessment method. The evaluation of our approach shows that it has done an effective job of estimating the relative risk levels of both small USC e-services and large industry COTS-based applications.
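The rule scheme the abstract describes can be illustrated with a small sketch: each rule fires when two cost drivers are jointly rated near their worst case, probability comes from a nonlinear weight on the ratings, and impact comes from the drivers' productivity ranges. The driver pairs, weights, and productivity-range numbers below are invented placeholders, not the calibrated COCOTS values.

```python
# Hypothetical rule base: each rule is a critical pair of glue code
# cost drivers. Ratings run 0 (nominal) to 3 (worst case).
RISK_RULES = [("ACIEP", "ACIPC"), ("AXCIP", "APCON")]
PROD_RANGE = {"ACIEP": 1.5, "ACIPC": 2.0, "AXCIP": 1.3, "APCON": 1.8}
WEIGHT = {0: 0, 1: 1, 2: 2, 3: 4}  # nonlinear 3-level weighting; 0 = no risk

def risk_level(a_rating, b_rating):
    """One simple way to combine two ratings: a rule only fires as
    strongly as its better-rated driver allows."""
    return WEIGHT[min(a_rating, b_rating)]

def project_risk(ratings):
    """Sum weighted rule firings, scaled by a cost-consequence proxy."""
    exposure = 0.0
    for a, b in RISK_RULES:
        w = risk_level(ratings[a], ratings[b])
        impact = PROD_RANGE[a] * PROD_RANGE[b]  # productivity-range product
        exposure += w * impact
    return exposure
```

Because the inputs are exactly the cost-driver ratings already entered for a COCOTS effort estimate, the risk diagnosis comes for free, which is the approach's main selling point.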

 

 

Document characteristics:

added: Nov. 11, 2005


 

USC-CSE-2005-513 PDF

 

Achievements and challenges in software resources estimation

 

Barry Boehm, Ricardo Valerdi

 

Submitted to ICSE 2006

 

Document characteristics:

added: Nov. 11, 2005


 

USC-CSE-2005-512 PDF

 

Empirical Results from an Experiment on Value-Based Review (VBR) Processes

Keun Lee, Barry Boehm

 

As part of our research on value-based software engineering, we conducted an experiment on the use of value-based review (VBR) processes. We developed a set of VBR checklists with issues ranked by success criticality, and a set of VBR processes prioritized by issue criticality and stakeholder-negotiated product capability priorities. The experiment involved 28 independent verification and validation (IV&V) subjects (full-time working professionals taking a distance learning course) reviewing specifications produced by 18 real-client, full-time student e-services projects. The IV&V subjects were randomly assigned to use either the VBR approach or our previous value-neutral checklist-based reading (CBR) approach. The difference between groups was not statistically significant for number of issues reported, but was statistically significant for number of issues per review hour, total issue impact, and cost effectiveness in terms of total issue impact per review hour. For the latter, the VBRs were roughly twice as cost-effective as the CBRs.
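The experiment's comparison metrics reduce to simple ratios over the issues each group reported. The issue impacts and review hours below are invented for illustration, not the experiment's actual data.

```python
# Compute the three outcome measures compared in the experiment:
# issues found per hour, total issue impact, and impact per hour.
def review_metrics(issue_impacts, review_hours):
    return {
        "issues_per_hour": len(issue_impacts) / review_hours,
        "total_impact": sum(issue_impacts),
        "impact_per_hour": sum(issue_impacts) / review_hours,
    }

vbr = review_metrics([4, 3, 3, 2], review_hours=4.0)        # value-based group
cbr = review_metrics([2, 2, 1, 1, 1, 1], review_hours=5.0)  # checklist group
```

With these made-up figures, the VBR group's impact per hour (3.0) is just under twice the CBR group's (1.6), mirroring the roughly two-to-one cost-effectiveness result reported above.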

 

ISESE 2005

 

Document characteristics:

added: Nov. 11, 2005


 

USC-CSE-2005-511 PDF

 

Value-based Feedback in Software and Information Systems Development

LiGuo Huang, Barry Boehm

 

The role of feedback control in software and information system development has traditionally focused on a milestone plan to deliver a pre-specified set of capabilities within a negotiated budget and schedule. One of the most powerful approaches available for controlling traditional software projects is called the Earned Value system. However, the Earned Value Management process is generally good for tracking whether a project is meeting its original plan. It becomes difficult to administer if the project plans change rapidly. And more significantly it has absolutely nothing to say about the actual value being earned for the organization by the results of the project.
This chapter begins by summarizing a set of four nested feedback and feedforward loops that have been successfully used to scope, estimate, control, and improve the predictability and efficiency of software development and evolution. It then proposes an alternative approach to project feedback control that focuses on the actual stakeholder value likely to be earned by completing the project, and provides a framework for monitoring and controlling value in terms of a Benefits Realization Approach [Thorp, 1998] and business case analysis. An order processing system is used as an example to illustrate the value-based feedback control mechanisms. The chapter concludes with directions for future research and development.
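The Earned Value mechanism discussed above reduces to a comparison of budgeted and actual cost of work performed. The task figures below are invented for illustration; note that, as the abstract argues, nothing in these formulas measures the business value the project actually delivers.

```python
# Standard earned-value bookkeeping for a project snapshot.
def earned_value(tasks):
    """tasks: list of (budget, planned_fraction, completed_fraction, actual_cost)."""
    bcws = sum(b * p for b, p, _, _ in tasks)  # budgeted cost of work scheduled
    bcwp = sum(b * c for b, _, c, _ in tasks)  # budgeted cost of work performed
    acwp = sum(a for _, _, _, a in tasks)      # actual cost of work performed
    return {"cpi": bcwp / acwp,   # cost performance index
            "spi": bcwp / bcws}   # schedule performance index

# Two hypothetical tasks: one done over budget, one behind schedule.
status = earned_value([(100, 1.0, 1.0, 120), (200, 0.5, 0.25, 60)])
```

Both indices here fall below 1.0, flagging cost and schedule trouble, yet they would read identically whether the delivered capabilities were highly valuable or worthless to the stakeholders.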

 

Software Evolution and Feedback

 

Document characteristics:

added: Nov. 11, 2005


 

 

USC-CSE-2005-510 PDF

 

How Much Software Quality Investment is Enough: A Value-Based Approach

 

LiGuo Huang, Barry Boehm

 

A classical problem facing many software projects is how to determine when to stop testing and release the product for use. We have found that risk analysis helps to address such "how much is enough?" questions, by balancing the risk exposure of doing too little with the risk exposure of doing too much. However, people have often found it difficult to quantify the relative probabilities and sizes of loss in order to provide practical approaches for determining a risk-balanced "sweet spot" operating point.

We provide a quantitative approach based on the COCOMO® II cost estimation model and the COQUALMO quality estimation model to help project decision-makers determine "how much software quality investment is enough?" We also provide examples of its use under differing value profiles. Further, we use the models and some representative empirical data to assess the relative payoff of value-based testing as compared to value-neutral testing.
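The "sweet spot" reasoning can be made concrete with a minimal numeric sketch: total risk exposure is the exposure from stopping testing too early (field-defect losses) plus the exposure from stopping too late (delayed-delivery losses), and the operating point is where their sum is smallest. Both curve shapes and all constants below are invented for illustration, not taken from the COCOMO® II or COQUALMO models.

```python
import math

def re_defects(t):
    """Risk exposure from residual defects; falls as testing time t (months) grows."""
    return 100 * math.exp(-1.2 * t)

def re_delay(t):
    """Risk exposure from delayed market entry; rises with testing time."""
    return 8 * t

def sweet_spot(t_max=10.0, step=0.01):
    """Grid-search the testing time that minimizes total risk exposure."""
    ts = [i * step for i in range(int(t_max / step) + 1)]
    return min(ts, key=lambda t: re_defects(t) + re_delay(t))
```

Different value profiles shift the curves: a safety-critical system steepens the defect-loss curve and pushes the sweet spot toward more testing, while an early-startup profile steepens the delay curve and pushes it the other way.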

To appear in IEEE Software

Document characteristics:

added: Nov. 10, 2005


 

USC-CSE-2005-509 PDF

COCOMO® Suite Methodology and Evolution

Dr. Barry Boehm, Ricardo Valerdi, Jo Ann Lane, and A. Winsor Brown

Over the years, software managers and software engineers have used various cost models such as the Constructive Cost Model (COCOMO®) to support their software cost and estimation processes. These models have also helped them to reason about the cost and schedule implications of their development decisions, investment decisions, client negotiations and requested changes, risk management decisions, and process improvement decisions. Since its creation, COCOMO® has cultivated a user community that has contributed to its development and calibration. COCOMO® has also evolved to meet user needs as the scope and complexity of software system development have grown. This eventually led to the current version of the model: COCOMO® II.2000.3. The growing need for the model to estimate different aspects of software development served as a catalyst for the creation of derivative models and extensions that could better address commercial off-the-shelf software integration, system engineering, and system-of-systems architecting and engineering. This article presents an overview of the models in the COCOMO® suite, including extensions and independent models, and describes the underlying methodologies, the logic behind the models, and how they can be used together to support larger software system estimation needs. It concludes with a discussion of the latest University of Southern California Center for Software Engineering effort to unify these various models into a single, comprehensive, user-friendly tool.

CrossTalk, April 2005

Document characteristics:

added: Nov. 10, 2005


 

 

USC-CSE-2005-508 PDF

System of Systems Lead System Integrators: Where do They Spend Their Time and What Makes Them More/Less Efficient? - Background for COSOSIMO

Jo Ann Lane

As organizations strive to expand system capabilities through the development of system-of-systems (SoS) architectures, they want to know "how much effort" and "how long". In order to answer these questions, it is important to first understand the types of activities performed in SoS architecture development and integration and how these vary across different SoS implementations. This paper provides preliminary results of research conducted to determine types of SoS Lead System Integrator (LSI) activities and how these differ from the more traditional system engineering activities described in EIA 632 (Processes for Engineering a System). It also looks at concepts in organizational theory, complex adaptive systems, and chaos theory and how these might be applied to SoS LSI activities to improve success rates and efficiency in the development of these "very large" complex systems.

Document characteristics:

added: Nov. 10, 2005


 

USC-CSE-2005-507 PDF

 

The Future of Software and Systems Engineering Processes

 

Barry Boehm

 

In response to the increasing criticality of software within systems and the increasing demands being put onto software-intensive systems, software and systems engineering processes will evolve significantly over the next two decades. This paper identifies eight relatively surprise-free trends - the increasing interaction of software engineering and systems engineering; increased emphasis on users and end value; increasing software criticality and need for dependability; increasingly rapid change; increasing global connectivity and need for systems to interoperate; increasingly complex systems of systems; increasing needs for COTS, reuse, and legacy systems and software integration; and computational plenty. It also identifies two "wild card" trends: increasing software autonomy and combinations of biology and computing. It then discusses the likely influences of these trends on software and systems engineering processes between now and 2025, and presents an emerging three-team adaptive process model for coping with the resulting challenges and opportunities of developing 21st century software-intensive systems and systems of systems.

 

Document characteristics:

added:  June 29, 2005

 


 

USC-CSE-2005-506 PDF

 

A Generic Approach for Estimating the Energy Consumption of Component-Based Distributed Systems

 

Chiyoung  Seo, Sam Malek, and Nenad Medvidovic

 

In distributed software systems, each software component interacts with other components in order to provide users with various services. Recently, portable devices (e.g., PDAs) with wireless network capabilities have been widely used in building Java-based distributed software systems. However, these portable devices generally suffer from limited battery power. Since each device has a different battery capacity and each software component consumes a different amount of power, the initial deployment of software components across the participating devices may no longer be appropriate with respect to the duration of services as the devices' batteries are drained. Therefore, it is necessary to redeploy software components across mobile devices at runtime, taking into account each component's energy consumption and the remaining battery capacity of each device, in order to increase the lifetime of the services provided by the distributed software components. As part of our work toward this goal, we suggest a generic approach for estimating the energy consumption of Java-based software components that can be easily applied to heterogeneous devices. Through extensive experiments, we show that our estimation model is generic and highly accurate compared with the actual energy consumption.
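The estimation idea can be sketched as follows: a component's energy is its computational cost (operation counts times per-operation energy coefficients, obtained by profiling each device once) plus its communication cost (bytes exchanged times per-byte radio costs). The coefficient names and values below are invented for illustration, not measured values from the paper.

```python
# Hypothetical per-device energy coefficients (joules per operation /
# per byte); in practice these would come from one-time profiling.
COMP_COST = {"arithmetic": 4e-9, "load_store": 6e-9, "invoke": 2e-8}
SEND_COST, RECV_COST = 1.9e-6, 1.1e-6

def component_energy(op_counts, bytes_sent, bytes_received):
    """Estimate one component's energy as compute cost plus radio cost."""
    compute = sum(COMP_COST[kind] * n for kind, n in op_counts.items())
    comms = SEND_COST * bytes_sent + RECV_COST * bytes_received
    return compute + comms
```

Because only the coefficient table is device-specific, the same estimator carries over to heterogeneous devices, which is the generality the abstract claims for the approach.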

 

 

Document characteristics:

added:  April 4, 2005

 


USC-CSE-2005-505 PDF

 

An Initial Theory of Value-Based Software Engineering

 

Barry Boehm, Apurva Jain

 

This chapter presents an initial “4+1” theory of value-based software engineering (VBSE). The engine in the center is the stakeholder win-win Theory W, which addresses the questions of “which values are important?” and “how is success assured?” for a given software engineering enterprise. The four additional theories that it draws upon are utility theory (how important are the values?), decision theory (how do stakeholders’ values determine decisions?), dependency theory (how do dependencies affect value realization?), and control theory (how to adapt to change and control value realization?). After discussing the motivation and context for developing a VBSE theory and the criteria for a good theory, the chapter discusses how the theories work together into a process for defining, developing, and evolving software-intensive systems. It also illustrates the application of the theory to a supply chain system example, discusses how well the theory meets the criteria for a good theory, and identifies an agenda for further research.

 

Document characteristics:

added: March 31, 2005

 


USC-CSE-2005-504 PDF

 

Value-Based Software Engineering: Overview and Agenda

 

Barry Boehm

 

Much of current software engineering practice and research is done in a value-neutral setting, in which every requirement, use case, object, test case, and defect is equally important. However most studies of the critical success factors distinguishing successful from failed software projects find that the primary critical success factors lie in the value domain.


Document characteristics:

added: March 31, 2005

 


USC-CSE-2005-503 PDF

 

Value-Based Software Engineering: Seven Key Elements and Ethical Considerations

 

Barry Boehm

 

This chapter presents seven key elements that provide candidate foundations for value-based software engineering:

1. Benefits Realization Analysis

2. Stakeholder Value Proposition Elicitation and Reconciliation

3. Business Case Analysis

4. Continuous Risk and Opportunity Management

5. Concurrent System and Software Engineering

6. Value-Based Monitoring and Control

7. Change as Opportunity

Using a case study, it then shows how some of these elements can be used to incorporate ethical considerations into daily software engineering practice.


Document characteristics:

added: March 31, 2005

 


USC-CSE-2005-502 PDF

 

Value-Based Verification and Validation Guidelines

 

Keun Lee, Monvorath Phongpaibul, Barry Boehm

 

The USC Center for Software Engineering’s Value-Based Software Engineering agenda involves experimentation with value-based reformulations of traditional value-neutral software engineering methods. The experimentation explores conditions under which value-based methods lead to more cost-effective project outcomes, and assesses the degree of impact that value-based methods have on the various dimensions of project outcomes. Examples of areas in which value-based techniques have shown improvements in cost-effectiveness include stakeholder win-win requirements determination, use of value-based anchor point milestones, use of prioritized requirements to support schedule-as-independent-variable development processes, and the use of risk management and business case analysis to support value-based project monitoring and control.


Document characteristics:

added: March 31, 2005

 


USC-CSE-2005-501 PDF

 

Software Engineering Graduate Project Effort Analysis Report

 

Zhihao Chen

 

In the graduate course sequence CSCI577ab, graduate students apply software engineering methodologies, processes, procedures, and models to manage software development. This report analyzes their activities and effort distribution from fall 2001 to spring 2004. It should help students appropriately arrange their time, manage their schedules, and plan their projects. It should also be helpful for software engineering research.

 


Document characteristics:

added: January 25, 2005

 


USC-CSE-2005-500 PDF

 

Software Process Disruptors, Opportunity Areas, and Strategies

 

Barry Boehm

 

The near future (5-10 years) of software processes will be largely driven by disruptive forces that require organizations to change their traditional ways of doing business. This report begins with a discussion of the major current and near-future disruptors in the software process area and how they interact. It then discusses major trends in terms of opportunity areas for dealing with various combinations of disruptors. Based on the opportunity areas, it then identifies some attractive future strategies that appear to have high payoff.


Document characteristics:

added: January 6, 2005

 


 

 

Return to Technical Report System

Copyright 2004, 2005 The University of Southern California

The written material, text, graphics, and software available on this page and all related pages may be copied, used, and distributed freely as long as the University of Southern California as the source of the material, text, graphics or software is always clearly indicated and such acknowledgement always accompanies any reuse or redistribution of the material, text, graphics or software; also permission to use the material, text, graphics or software on these pages does not include the right to repackage the material, text, graphics or software in any form or manner and then claim exclusive proprietary ownership of it as part of a commercial offering of services or as part of a commercially offered product.