COMPUTING, SOFTWARE, AND THE NEXT DECADE: 60, 30, AND 10

30th Annual International Computer Software and Applications Conference


Chicago, September 17-21, 2006

TUTORIALS


COMPSAC 2006 Tutorials

IEEE COMPSAC 2006 offers 2 full-day and 6 half-day tutorials, as listed below. You may also register to attend the tutorials offered by ICWS/SCC. The three conferences (COMPSAC, ICWS, and SCC) are co-located in the 2006 Congress (http://conferences.computer.org/costep), and your COMPSAC registration badge will admit you to the ICWS/SCC area.

Register for tutorials now!

Sunday, 17 September 2006

9:30-12:30
  T3 - Model Checking (Lenore Zuck & Ken McMillan) (Cancelled)

13:30-17:00
  T1 - Software Requirements from Folklore to Engineering Analysis (Larry Bernstein) (Cancelled)
  T2 - Identity Management Systems (Kal Toth) (Cancelled)

Monday, 18 September 2006

9:30-13:00
  T4 - Testing in a Quasi-Agile Software Development Environment (Timothy D. Korson)
  T5 - An Introduction to Computer Forensics (Warren Harrison)
  T6 - Best Practices for Software Quality Specification, Testing & Certification of COTS & Bespoke (Hans-Ludwig Hausen)

13:30-17:00
  T7 - Introduction to Lean Six Sigma (Kenneth D. Shere) (Cancelled)
  T8 - Testing Object-Oriented and Web-Based Applications (David C. Kung)

Tutorial 1: Software Requirements from Folklore to Engineering Analysis -- Instructor: Larry Bernstein (Cancelled)
9/17/2006 (Sunday)
13:30-17:00

Lawrence Bernstein
Stevens Institute of Technology
lbernstein@ieee.org

Abstract:

Only sixty percent of a software system's eventual features are known by the time its requirements are declared complete. This is a major source of project failures: changes in business needs and the discovery of emerging needs too often cause projects to run late or overrun their cost projections. This tutorial provides a way to systematically analyze software application features and functions so that engineering trade-offs and priorities can be set during the requirements engineering phase of a project. The tutorial presents a process that integrates prototyping, model-driven development, a simplified approach to Quality Function Deployment, ICED-T, and the COCOMO model, along with a new approach to reliability. These analyses are performed during the requirements engineering phase and then repeated during each subsequent phase; their results provide the data needed to make schedule, cost, reliability, and feature tradeoffs. These quantitative tradeoffs add engineering rigor to traditional requirements synthesis from qualitative customer interactions. Case histories are used to illustrate the approach so that software professionals can apply the techniques directly and software engineering instructors can teach a quantitative approach to requirements engineering.
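
Since the tutorial uses the COCOMO model for schedule and cost tradeoffs, a small worked example may help fix ideas. The Python sketch below applies the basic COCOMO effort and schedule equations for an organic-mode project; the coefficients are Boehm's published values, and the 40 KLOC size is a hypothetical figure chosen only for illustration (the tutorial's own treatment may differ, for example by using COCOMO II).

    # Basic COCOMO (organic mode): effort and schedule from size in KLOC.
    # Coefficients are Boehm's published organic-mode values; the 40 KLOC
    # input is a hypothetical figure used only for illustration.
    def cocomo_organic(kloc: float) -> tuple[float, float]:
        effort = 2.4 * kloc ** 1.05       # effort in person-months
        schedule = 2.5 * effort ** 0.38   # development time in months
        return effort, schedule

    effort, schedule = cocomo_organic(40.0)
    print(f"Effort:   {effort:.1f} person-months")
    print(f"Schedule: {schedule:.1f} months")
    print(f"Staffing: {effort / schedule:.1f} people on average")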

About the instructor:

Lawrence "Larry" Bernstein is a professor and a software management consultant. He is a seasoned executive having spent 35 years with Bell Laboratories. He directs the Quantitative Software Engineering Masters Degree program at Stevens Institute of Technology. He is expert in software technology, project management, and technology conversion. He teaches graduate courses on Software Engineering. In 2004 he helped five companies improve their use of technology. He has recently published a book and is a keynote speaker at professional meetings.

He had a distinguished 35-year career at Bell Laboratories managing large software projects, where he became Chief Technical Officer of the Operations Systems Business Unit and an Executive Director. In parallel with these Bell Labs positions, he was the Operations Systems Vice President of AT&T Network Systems from 1992 to 1996. Larry holds one patent for logic design, one for systems design, and seven for software innovations. He championed research into software fault tolerance and demonstrated its commercial applications to the extent that it is now used in 24 products deployed at more than 500 sites to improve software system reliability.

He is a Fellow of the Institute of Electrical and Electronics Engineers (IEEE) and a Fellow of the Association for Computing Machinery. He is a member of the Russian Information Academy, a visiting Associate of the University of Southern California's Center for Software Engineering, and an Industrial Fellow of the Ball State Center for Information and Communication Sciences. He is a member of the honor societies Tau Beta Pi and Eta Kappa Nu. He was awarded the coveted BellSouth "Eagle" for seminal contributions to their automatic service provisioning systems. Larry Bernstein holds a Master's degree in Electrical Engineering from NYU and a Bachelor's degree in Electrical Engineering from RPI.


Tutorial 2: Identity Management Systems -- Instructor: Kal Toth (Cancelled)
9/17/2006 (Sunday)
13:30-17:00

Kal Toth, Ph.D.
Portland State University
ktoth@cs.pdx.edu

Abstract:

The problem of managing multiple identities, authentication schemes, and authorities has escalated dramatically in the face of burgeoning system complexity. Increased demand for mobile information access, more flexible sharing of user terminals, and enhanced organizational interoperability have all contributed to this problem. In the absence of well-articulated and easy-to-use identity management systems, IT departments will continue to build ad hoc solutions for bridging identities and access controls. Emerging integrated identity and credential management systems, single sign-on (SSO), and XML-based standards are beginning to address this critical problem.

When accessing web-based information in next-generation law enforcement systems, end-users will employ both fixed and mobile terminal devices including desk-top PCs, vehicle-mounted data terminals, and hand-held wireless cell phones and PDAs. Some of these devices will be dedicated to a given user, but many will be shared. Meanwhile, organizations are increasingly motivated to make their sensitive information and services available beyond their firewalls to enhance field operations and collaboration.

Clearly, the demand for more diverse fixed and mobile terminal usage, terminal sharing, and cross-agency information sharing will significantly complicate end-user identification, authentication, credential issuing, permission checking, and overall security policy administration.

Several current and emerging identity management systems address various aspects of this problem, each with its benefits and limitations. This tutorial evaluates and compares the most notable of these architectures and technologies in the context of the identity and credential management problem, and explores alternatives and future directions in this interesting and challenging area.
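
As a concrete, deliberately simplified illustration of the single sign-on idea discussed above, the sketch below has one identity provider mint an HMAC-signed identity token that any cooperating service can verify against a shared secret. This is a generic illustration only, not any particular standard (SAML, for example, uses XML signatures) and not the instructor's Persona Concept; every name and value in it is hypothetical.

    # A deliberately simplified SSO-style token: an identity provider signs
    # (user, expiry) with a shared secret; relying services verify it.
    # Illustrative only -- real SSO standards are far richer.
    import base64, hashlib, hmac, json, time

    SECRET = b"demo-shared-secret"   # hypothetical key, for illustration

    def issue_token(user: str, ttl_seconds: int = 3600) -> str:
        claims = {"sub": user, "exp": time.time() + ttl_seconds}
        body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
        sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
        return body + "." + sig

    def verify_token(token: str):
        body, _, sig = token.rpartition(".")
        expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return None               # signature mismatch: reject
        claims = json.loads(base64.urlsafe_b64decode(body))
        return claims if claims["exp"] > time.time() else None

    token = issue_token("officer42")
    print(verify_token(token))        # the accepted claims, or None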

About the instructor:

Kal Toth is an Associate Professor in the College of Engineering and Computer Science at Portland State University in Portland, Oregon. He is also the Associate Director of the Oregon Master of Software Engineering (OMSE) program and teaches graduate and undergraduate software engineering courses in that program. He conducts research in the field of information security, specifically in the area of identity and electronic credential management. He has a Ph.D. in Computer Systems Engineering from Carleton University (Ottawa, Canada). A long-time member of the Association of Professional Engineers and Geoscientists of British Columbia (APEGBC), he holds a P.Eng. with a software engineering designation.

Kal Toth is a member of Portland State University's Center for Information Assurance. He is also an editor of and a regular contributor (5 articles to date) to the Software Association of Oregon's Cursor newsletter. He conducts seminars in the fields of information security, software engineering, and software project management, and has conducted seminars on cyber security for APEGBC, National Defence Canada, and MacDonald Dettwiler & Assoc. Kal Toth has been conducting research in the field of information security as it relates to using personally held identity and electronic credentials to access and share sensitive information. His work on the "Persona Concept" has focused on the challenges of securely sharing common terminals among users, sharing security tokens across different terminal types (PCs, vehicle-mounted terminals, cell phones, and PDAs), and supporting interoperability among collaborating organizations. An important part of his Persona Concept research has been to examine the benefits and limitations of existing and emerging identity management systems, contrasting them with the Persona Concept.

Before joining academia, Kal practiced and consulted in the systems and software industry for over 25 years, leading and supporting several systems development and technology consulting teams that addressed the security properties, requirements, and implementations of systems and products. Relevant projects he has completed include: the development of an embedded crypto product for e-business applications; third-party verification and validation of a distributed secure information system network connecting Canada's embassies abroad (while with CGI Group Inc.); the analysis and design of several secure network gateways and interfaces for National Defence Canada; and the analysis of the security networking and product requirements for Canada's new Air Traffic Control system (while with Hughes Aircraft).


Tutorial 3: Model Checking -- Instructors: Lenore Zuck and Ken McMillan (Cancelled)
9/17/2006 (Sunday)
9:00-12:30
12:30-13:30 (lunch)
13:30-17:00

Lenore Zuck, Ph.D.
University of Illinois at Chicago
lenore@cs.uic.edu

Ken McMillan, Ph.D.
Cadence Berkeley Labs
mcmillan@cadence.com


Abstract:

Model checking was proposed in the 1980s as an approach to the formal verification of finite-state systems against temporal logic specifications. Nowadays, model checking is used in the formal verification of both hardware and software systems. The tutorial will describe the method and its applications, starting from the basic underlying ideas and leading to the enhancements that have brought the technique to its current popularity in verifying hardware as well as software systems. The tutorial will also describe and demonstrate several model checking tools.
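
The core idea is easy to demonstrate in miniature. The sketch below is a toy explicit-state model checker, not one of the tools demonstrated in the tutorial: it enumerates the reachable states of a hypothetical two-process lock protocol by breadth-first search and checks the safety property that the two processes are never in the critical section simultaneously.

    # Toy explicit-state model checking: enumerate reachable states of a
    # two-process lock protocol and check a mutual-exclusion invariant.
    # Real tools (e.g., SMV) use symbolic techniques to scale far beyond this.
    from collections import deque

    def successors(state):            # state = (pc0, pc1, lock_held)
        pc0, pc1, lock = state
        pcs = [pc0, pc1]
        for i in range(2):
            if pcs[i] == "idle":
                yield step(pcs, i, "trying", lock)
            elif pcs[i] == "trying" and not lock:
                yield step(pcs, i, "critical", True)   # acquire the lock
            elif pcs[i] == "critical":
                yield step(pcs, i, "idle", False)      # release the lock

    def step(pcs, i, new_pc, lock):
        updated = list(pcs)
        updated[i] = new_pc
        return (updated[0], updated[1], lock)

    def safe(state):                  # invariant: mutual exclusion
        return not (state[0] == "critical" and state[1] == "critical")

    init = ("idle", "idle", False)
    seen, queue = {init}, deque([init])
    while queue:                      # breadth-first reachability
        s = queue.popleft()
        assert safe(s), f"invariant violated in {s}"
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                queue.append(t)
    print(f"invariant holds in all {len(seen)} reachable states")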

About the instructors:

Lenore Zuck teaches at the University of Illinois at Chicago, where she also conducts research in formal methods. She has led the Translation Validation team at NYU and has actively participated in industrial projects that implement translation validation at both Intel and Microsoft. Before joining UIC, Lenore held faculty positions at Yale and at NYU. She holds a BS in computer science from the Technion (1979) and an MS and a Ph.D. in computer science from the Weizmann Institute of Science.

Ken McMillan is a fellow at Cadence Berkeley Labs, where he develops tools and algorithms for formal verification of both hardware and software. He is the creator of SMV, the first symbolic model checking system. Ken holds a BS in electrical engineering from the University of Illinois at Urbana (1984), an MS in electrical engineering from Stanford (1986) and a Ph.D. in computer science from Carnegie Mellon (1992). Before his Ph.D. studies, he worked as a chip designer and a biomedical engineer. He lives in Berkeley, California.


Tutorial 4: Testing in a Quasi-Agile Software Development Environment -- Instructor: Timothy D. Korson
9/18/2006 (Monday)
9:30-13:00

Timothy D. Korson, Ph.D.
Korson Consulting
San Jose, CA USA
tim@korson-consulting.com

Abstract:

This tutorial focuses on practical issues faced by increasing numbers of testers today. These issues arise from the fact that most test organizations are still structured around traditional software development practices even though many software development teams are heading full steam into modern agile software development techniques. QA managers trying to encourage best practices as recommended by CMMI and SPICE find themselves at odds with developers trying to adopt best practices as recommended by the Agile Manifesto. This leaves corporate QA stuck coping with an organizational and technical paradigm shift that traditional QA policies and practices are inadequate to handle. In the highly iterative environment characteristic of these agile development projects, development and testing processes are much more tightly integrated. System testers are expected to test early immature increments of the software, and are often called upon to plan, support and review the unit and component-level testing process. Developers, in addition to unit testing, may be called upon to assist with the automation of certain system-level tests. Risk assessment and overall test asset allocation must also be adapted.

Attendees will learn to integrate development and testing processes according to current best software engineering practices, and how to create and execute effective tests at all levels and for all development phases of modern software systems. The presentation covers organizational issues for the testing process that are introduced by the aggressively iterative, incremental nature of agile software development projects. Specific testing techniques covered include incrementally deriving system test cases from requirements, as well as ways to exploit the well-specified interfaces of components. In addition to the discussion of techniques and best practices, this tutorial addresses how to adapt, survive, and hopefully even thrive in mixed-culture environments, where the developers are coming from an agile mindset but some or all of the stakeholders, managers, testers, and others in the organization are coming from a traditional mindset.
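
The developer-side testing mentioned above is easiest to picture with a small sample. The sketch below shows the kind of automated unit test an agile team reruns on every increment; the discount function and its business rule are hypothetical, and plain unittest stands in for whatever framework a team actually uses.

    # A minimal automated unit test of the kind rerun on every increment.
    # The discount rule and function are hypothetical examples.
    import unittest

    def discount(order_total: float) -> float:
        """Apply a 10% discount to orders of $100 or more."""
        return order_total * 0.9 if order_total >= 100 else order_total

    class DiscountTests(unittest.TestCase):
        def test_small_order_unchanged(self):
            self.assertEqual(discount(50.0), 50.0)

        def test_boundary_gets_discount(self):    # boundary-value case
            self.assertAlmostEqual(discount(100.0), 90.0)

        def test_large_order_discounted(self):
            self.assertAlmostEqual(discount(200.0), 180.0)

    if __name__ == "__main__":
        unittest.main()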

About the instructor:

Timothy Korson has well over a decade of substantial experience working on a large variety of systems developed using modern software engineering techniques. This experience includes distributed, real time, embedded systems as well as business information systems in an n-tier, client-server environment. Korson's typical involvement on a project is as a senior management consultant with additional technical responsibilities to ensure high quality, robust test and quality assurance processes and practices. Korson has authored numerous articles, and co-authored a book on Object Technology Centers. He has given frequent invited lectures at major international conferences and has contributed to the discipline through original research. The lectures and training classes he presents are highly rated by the attendees.

Tutorial 5: An Introduction to Computer Forensics -- Instructor: Warren Harrison
9/18/2006 (Monday) (Total 6 hours)
9:30--12:30
12:30--13:30 (lunch)
13:30--17:30

Warren Harrison, Ph.D.
Portland State University
warren@cs.pdx.edu

Abstract:

Long the domain of law enforcement, computer forensics is beginning to enter the mainstream of computing sciences as digital devices increasingly become a ubiquitous part of daily life. Unlike many fields within the computing domain, advances are not so much limited by technology as they are by the artificial constraints imposed by statutory and constitutional limitations. This tutorial will introduce participants to those limitations, discuss the principles and practices of contemporary computer forensics, and explore the current and future challenges for software technologists working in this space.
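
One staple practice of the field can be shown in a few lines: cryptographic hashing to document that an evidence image has not been altered between acquisition and analysis. The sketch below streams a file through SHA-256; the file name is hypothetical, and actual forensic procedure involves far more (write blockers, chain-of-custody records, and so on).

    # Hash a disk or file image to document evidence integrity: the digest
    # recorded at acquisition can be recomputed later to show nothing changed.
    # The file name is hypothetical.
    import hashlib

    def evidence_digest(path: str, chunk_size: int = 1 << 20) -> str:
        sha = hashlib.sha256()
        with open(path, "rb") as image:
            while chunk := image.read(chunk_size):   # stream; don't load it all
                sha.update(chunk)
        return sha.hexdigest()

    print("sha256:", evidence_digest("suspect_drive.img"))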


About the instructor:

Warren Harrison is a Professor of Computer Science at Portland State University and current Editor-in-Chief of IEEE Software magazine. He also served as Editor-in-Chief of the Software Quality Journal (1999-2001) and founding co-EIC (with Vic Basili) of the Empirical Software Engineering Journal (1995-2002). He has been active in the digital forensics and law enforcement communities for over five years. He served as a Police Reserve Specialist in Computer Crime and Digital Forensics with the Hillsboro, Oregon Police Department from 2002 to 2004, and has been a sworn Reserve Deputy Sheriff with the Clackamas County, Oregon Sheriff's Office since 2004, where he is currently assigned to the Patrol Division. He has been a member of the program committees of the annual Digital Forensics Research Workshop, the IFIP WG 11.9 International Conference on Digital Forensics, and the Computer Forensics track of the ACM Symposium on Applied Computing, and is a member of the International Board of Referees of the Digital Investigation journal published by Elsevier. He has previously held positions with Bell Telephone Laboratories and Lawrence Livermore National Laboratory. He is the author of over 60 articles, papers, and book chapters in software engineering, computer security, and computer forensics. He holds a B.S. in Accounting from the University of Nevada, an M.S. in Computer Science from the University of Missouri-Rolla, and a Ph.D. in Computer Science from Oregon State University.

Tutorial 6: Best Practices for Software Quality Specification, Testing and Certification -- Instructor: Hans-Ludwig Hausen
9/18/2006 (Monday)
9:30-13:00

Hans-Ludwig Hausen, Ph.D.
Fraunhofer Institute, St. Augustin, Germany
hausen@fit.fraunhofer.de

Abstract:

The seminar will cover the principles, the normative quality characteristics, and the standardized procedures of quality assurance for information and software systems (comprising V&V, testing, measurement, and assessment), applied to procedural, object-oriented, and agent-based dependable software systems.

Attendees will exercise proven techniques for goal-directed measurement, scaling, and assessment for software certification. Assessment of both the software product and the software process will be discussed with respect to their relevance for such acceptance assessments.
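
One ingredient of such goal-directed assessment can be sketched briefly: normalizing raw measures onto a common scale and aggregating them with goal-derived weights into a single assessment score. All characteristics, weights, and thresholds below are hypothetical; the tutorial's own scheme follows the relevant standards rather than this toy.

    # A toy goal-directed assessment: map raw measures onto a 0..1 scale and
    # aggregate with goal-derived weights. All numbers are hypothetical.
    measures = {
        # name: (raw value, worst acceptable, best achievable, weight)
        "defect density per KLOC":  (1.8, 5.0, 0.0, 0.40),
        "statement coverage":       (0.85, 0.5, 1.0, 0.35),
        "mean time to repair (h)":  (6.0, 24.0, 1.0, 0.25),
    }

    def normalize(value, worst, best):
        score = (value - worst) / (best - worst)   # 0 at worst, 1 at best
        return max(0.0, min(1.0, score))

    total = sum(normalize(v, lo, hi) * w for v, lo, hi, w in measures.values())
    print(f"aggregate quality score: {total:.2f} (certify if above a set threshold)")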

A standardized process model for the measurement, assessment, and certification of dependable software will be used to familiarize attendees with this comprehensive assessment procedure and to show how to embed it into today's standardized or non-standardized software processes.

Basic knowledge in mathematics and some knowledge of software methods and tools are required. Emphasis will be given to selected advanced topics depending on the needs of participants.

About the instructor:

Hans-Ludwig Hausen holds degrees in Electrical Engineering from the University of Wuerzburg/Schweinfurt and in Computer Science from the Technical University of Berlin. He is currently a Principal Senior Researcher at the Fraunhofer Institute and an experienced project manager, consultant, and lecturer in the following areas: dedicated and general Information Systems (IS), Computer Aided Software Engineering (CASE), Computer Supported Collaborative Work (CSCW), Software and Systems Quality Engineering (SQE), and Business Process Engineering (BPE), as well as Conformance Testing and Certification (CTC) for national and international projects in a number of application domains (office and embedded systems, archive and library systems, healthcare systems). Based on that experience he has written more than 120 reviewed publications on information storage and retrieval systems, software engineering environments, software quality and productivity, process engineering, and teamware.


Tutorial 7: Introduction to Lean Six Sigma -- Instructor: Kenneth D. Shere (Cancelled)
9/18/2006 (Monday)
13:30-17:00

Kenneth D. Shere, Ph.D.
The Aerospace Corporation
shere@aero.org

Abstract:

Lean Six Sigma (LSS), as used by consulting companies today, represents the cumulative knowledge of over eighty years of process improvement methodology. It draws on methods used in statistical quality control, total quality management, business process reengineering, lean manufacturing, and Six Sigma (as originally developed by Motorola in 1986).

This tutorial will be interactive and include exercises. Attendees will gain an understanding of LSS thinking and a top-level view of what LSS is. We will discuss who uses LSS in both government and industry, and compare LSS to other process improvement methods.

In addition to a high-level understanding of LSS, attendees will be given an approach to determining the cost of implementing Six Sigma and the expected return on investment, illustrated with numbers drawn from a case study. Application of Six Sigma to software will be discussed. A comparison of Lean Six Sigma to CMM will be provided, together with an indication of how the two can be used together.
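
The arithmetic underlying such cost and ROI estimates is standard and easy to illustrate. The sketch below converts a defect count into defects per million opportunities (DPMO) and then into a sigma level using the conventional 1.5-sigma long-term shift; the defect figures are hypothetical.

    # Standard Six Sigma arithmetic: defects -> DPMO -> sigma level,
    # with the conventional 1.5-sigma long-term shift. Figures are hypothetical.
    from statistics import NormalDist

    def sigma_level(defects: int, units: int, opportunities_per_unit: int) -> float:
        dpmo = defects / (units * opportunities_per_unit) * 1_000_000
        yield_fraction = 1 - dpmo / 1_000_000
        return NormalDist().inv_cdf(yield_fraction) + 1.5

    # e.g., 230 defects in 10,000 modules with 5 defect opportunities each
    print(f"sigma level: {sigma_level(230, 10_000, 5):.2f}")   # about 4.1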

This tutorial will close with a discussion of how to apply LSS to the acquisition of software-intensive systems. The use of LSS during the pre-solicitation, proposal evaluation, and contract oversight phases of acquisition will be illustrated.


About the instructor:

Kenneth Shere is recognized as an expert in systems and software engineering, process improvement, and quality assurance. He has extensive experience in strategic and business planning and has facilitated leadership activities for several customers. He has developed software methodology for leading corporations and provided expert-witness services on software business practices to the largest American firm practicing corporate law. Kenneth Shere has given tutorials internationally on the subjects of quality assurance and software engineering. He is the author of the book Software Engineering and Management, published by Prentice Hall, editor of and contributor to a book on command and control, and author of over 20 papers in refereed journals on a wide variety of topics. He has worked on application areas ranging from logistics to satellite systems.

Dr. Shere has provided consulting on major software-intensive systems to all military services, the National Reconnaissance Office (NRO), the National Oceanic and Atmospheric Administration (NOAA), the Federal Aviation Administration (FAA), and the Federal Bureau of Investigation (FBI). He has been an employee of The Aerospace Corporation for ten years. Previously, he was an independent consultant and partner in a small systems engineering company for ten years. Earlier he worked in industry and government, where he was manager and technical director of groups ranging from 6 to 180 people.

Dr. Shere earned a B.S. in Aeronautical and Astronautical Engineering, an M.S. in Mathematics, and a Ph.D. in Applied Mathematics, each from the University of Illinois. He was trained as a Six Sigma green belt by Lockheed Martin Corporation and certified as a software capability evaluator by the Software Engineering Institute.

Tutorial 8: Testing Object-Oriented and Web-Based Applications -- Instructor: David C. Kung
9/18/2006 (Monday)
13:30-17:00

David C. Kung, Ph.D.
University of Texas at Arlington
kung@cse.uta.edu

Abstract:

Software quality is an important aspect of any software system, and software testing is a quality assurance activity that ensures the desired software quality objectives are met. Although numerous software testing methods have been reported in the literature, how a practitioner can implement and apply these methods in practice with minimal effort and maximal gain is usually not addressed. This is also true for object-oriented (OO) software and web-based application testing.

This tutorial aims to provide a practical introduction to methods and tools for testing OO software and web-based applications. The emphasis will be on the use of existing free software tools to implement and apply test generation methods in practice. In particular, an integrated framework that streamlines several free software tools to implement various test methods will be presented and demonstrated. Participants will gain hands-on experience testing OO software using a prototype of the framework. By the end of the tutorial, participants will have a basic understanding of OO software testing and will know how to use the tools to generate and execute test cases and analyze the test results with respect to software quality objectives.
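
The flavor of the state-based class testing involved can be suggested without any particular toolkit. In the sketch below, a hypothetical BoundedStack class is driven through state transitions and its boundary behavior is checked; plain unittest stands in for the free tools and framework the tutorial actually demonstrates.

    # State-based testing of a small OO class: drive the object through
    # transitions and assert on the outcome. BoundedStack is hypothetical;
    # plain unittest stands in for the tutorial's actual tools.
    import unittest

    class BoundedStack:
        def __init__(self, capacity: int):
            self.capacity, self.items = capacity, []
        def push(self, x):
            if len(self.items) == self.capacity:
                raise OverflowError("stack full")
            self.items.append(x)
        def pop(self):
            if not self.items:
                raise IndexError("stack empty")
            return self.items.pop()

    class BoundedStackTests(unittest.TestCase):
        def test_push_pop_sequence(self):       # empty -> partial -> empty
            s = BoundedStack(2)
            s.push(1); s.push(2)
            self.assertEqual([s.pop(), s.pop()], [2, 1])

        def test_overflow_is_rejected(self):    # full state rejects push
            s = BoundedStack(1)
            s.push(1)
            self.assertRaises(OverflowError, s.push, 2)

        def test_underflow_is_rejected(self):   # empty state rejects pop
            self.assertRaises(IndexError, BoundedStack(1).pop)

    if __name__ == "__main__":
        unittest.main()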

This tutorial is aimed at OO software and web-based application developers and testers. In addition, software project managers can benefit by gaining a basic understanding of software testing in general and of the freely available resources. The materials presented may also be useful for instructors who are offering or planning to offer a course on testing OO software and/or web-based applications.


About the instructor:

David Kung is a full professor in the Department of Computer Science and Engineering at The University of Texas at Arlington. He has more than 25 years of software engineering experience in academia and industry, and is in close contact with numerous companies through technical consulting, technology transfer, training, and research cooperation. He has worked in the area of testing OO software and web applications since 1992. He has published and edited three books and more than 100 technical papers in ACM, IEEE, and international journals and conference proceedings. He and his colleagues and students have designed and implemented OOTWorks, an OO software testing and maintenance toolkit that has been licensed to several companies.