Plenary keynote addresses
Monday 23 August, 10h30 - 12h
Internet Governance
Moderator: Jacques Berleur, Notre-Dame-de-la-Paix U., Belgium
Paul Twomey
President and CEO, ICANN (Internet Corporation for Assigned Names and Numbers), USA
Technical Coordination, Concepts of Governance and the need to support a rapidly globalising Internet
Abstract
The Internet and the economic and social interests it supports have grown geometrically over the last 15 years. At the core of this growth has been innovation and coordination among many technical and private players.
The expansion of the network has been the product of a balance of inputs and interests among technical, academic, business, civil-society and government actors. As the Internet becomes more pervasive, supporting economic and social interaction, its relationship with pre-existing concepts of governance and social norm-setting has become topical. The distinction between technical coordination and the governance of human interactions and socio-economic outcomes is essential.
Biography
Dr. Twomey was elected President and CEO of ICANN in March 2003.
ICANN is the international not-for-profit corporation charged with technical and policy coordination of the Internet protocol, including address and domain name functions. Dr. Twomey brings a wealth of experience to ICANN: he was closely involved with the international reform of the Internet's technical rule-making, and chaired ICANN's Governmental Advisory Committee (GAC) for three years ending November 2002. As Chair of the forty-plus-member GAC, he played a key role in the formulation of the technical and administrative rules for the commercial Internet globally. Dr. Twomey's background lends an appropriate balance of public and private experience. He is the founder of Argo P@cific, a high-level international advisory and investment firm that assists companies in building global Internet and technology businesses. Argo P@cific works with Fortune 500 companies and start-up entrepreneurial firms in the first stages of business to build international businesses and strategic alliances.
Prior to Argo P@cific, Twomey was founding Chief Executive Officer of the National Office for the Information Economy (NOIE) and the Australian government's Special Adviser for the Information Economy and Technology. He reported directly to the Minister for Communications, Information Technology and the Arts and to a Cabinet sub-committee, the Ministerial Council for the Information Economy.
Established in 1997, NOIE is Australia's lead Commonwealth agency for information-economy issues. As Special Adviser to the Australian government, Dr. Twomey was charged with providing strategic advice on its information-economy and technology priorities and strategies, including the development of a National Strategy for the Information Economy. He also represented Australia at international fora, such as the World Trade Organisation, the OECD and APEC, ensuring that Australia's interests were promoted during the formation of the rules and regulations of the international information economy.

Christopher Wilkinson
Adviser, Directorate General for Information Society, European Commission; Head of Secretariat, ICANN Governmental Advisory Committee (GAC)
Public Policy for Internet Governance
Abstract
The management of the Internet naming and addressing system has been assigned to a public-private partnership in the context of the ICANN organisation and the GAC. Although this system is still experimental, it has been much improved during the past two years. Problems have been identified and are being addressed; they would be present under any alternative arrangement. Effective global participation, both public and private, is key in this respect. Public participation must include the full range of languages, levels of development and geographical regions. Private participation is not limited to operators: users and civil society must also play their roles.
In future, ICANN and governments will have to address more in-depth issues within the ICANN mandate, such as IDN and IPv6. Governments will also have to address issues that are currently outside the ICANN mandate. This may give rise to different international systems, or to other solutions.
Meanwhile, work must go on. The Internet does not stop for international consultation. A global public-private partnership such as ICANN-GAC requires a great deal of hard work. Both private operators and public officials have to put in time and energy to achieve results. At its best, this can be more efficient than formal inter-governmental procedures. But it is not costless: a significant flow of public and private resources is necessary, on a permanent basis.
Biography
Christopher Wilkinson is Head of the GAC Secretariat and Adviser, Directorate General for Information Society, European Commission. The GAC Secretariat supports the work of the ICANN Governmental Advisory Committee. http://www.gac.icann.org
Following a career with the Commonwealth Secretariat, the OECD and the World Bank, Mr. Wilkinson joined the European Commission as Head of Division in 1973. He worked successively in Regional Policy, Industrial Affairs, Information Technology and Telecommunications, including responsibility for international aspects of information technology and telecommunications.
He was involved in the initial constitution of the ICANN organisation and represented the EU in the GAC from its inception in 1999. He was a GAC Vice-Chair in 2001-2002.
Mr. Wilkinson was educated in England, in Yorkshire and at Cambridge University, where he took degrees in Natural Science and Economics. He has also studied at the London Business School and at Harvard University. He taught economics at Holland Park Polytechnic (1962-63) and at the College of Europe, Bruges (1989-90).

Tuesday 24 August, 9h - 10h
Hervé Gallaire
President, Xerox Innovation Group; Corporate Senior Vice President and Chief Technology Officer, Xerox Corporation
Innovation and Information Processing
Abstract
Technical progress in information processing has been extraordinary over the last 40 years. In this presentation we will briefly review some of the most significant improvements in both hardware and software, from micro-electronics to storage, from algorithms to databases. While scientific advances have been key to this progress, the next wave of innovation will come from two different directions.
First, a number of application domains are going to drive further innovation at a fast pace. The development of MEMS technology will enable inexpensive, miniature sensors and actuators of all types. These will create innovation opportunities in major industries, including bioengineering, health and device service, as well as in the home. To realize these opportunities we will face system and software challenges. Since most of our systems-design knowledge has its roots in the mechanical-systems world, the applications and evolution of MEMS technologies will force us to rethink systems design, and new digital paradigms will take hold. The development of new web-services standards is also making great headway. This will be further accelerated by the better integration of information streams, including structured and unstructured information. This integration is key to the full automation of business processes. The enablers for this transformation are not only the web-services standards, including XML, but also natural-language-processing technologies, which have matured significantly. The presentation will review examples of these capabilities. There is no doubt that intelligent systems still need development to reach the next level of automation.
It is worth noting that a great deal of IT progress has been due to the creation of new knowledge in other sciences, like physics and chemistry, which drove advances in microelectronics, storage and communications. Therefore, the second driver for the next wave of innovation in IT will in fact also rely on scientific progress in other disciplines. In addition to continual improvements in silicon-based technology, today we are seeing the emergence of new semiconductor materials and processing methods based on organic and polymer technologies. These technologies will shape the future of display technologies, and also of MEMS, thereby closing the loop to where we started. A number of recent inventions will be described and ideas for innovative applications will be given.

Biography
Hervé Gallaire is President of the Xerox Innovation Group for Xerox Corporation, a position to which he was appointed in October 2001. He is also a corporate senior vice president and Chief Technology Officer.
He is responsible for overseeing the worldwide research and technology organizations in Xerox, to maximize the company's $1 billion annual investment in research and development. He is also responsible for intellectual-property management, licensing, and value creation from intellectual property and technology. Additionally, he oversees the operations of a number of companies in the Xerox portfolio.
As head of research and technology, Gallaire leads an organization that is one of the world's most prolific generators of patentable ideas, with world-renowned research and technology centers that include PARC Incorporated, the Xerox Research Centre of Europe in Grenoble, the Xerox Research Centre of Canada, and the Wilson Center for Research and Technology in Webster, N.Y., as well as the Imaging & Services Technology Center and the Xerox Engineering Center, which are distributed over several locations in the USA.
From 1993 to 1998 he served as manager of the Xerox Research Centre Europe in Grenoble, France, which develops document technology for multilingual and multinational uses. Subsequently, Gallaire held the position of Chief Architect for the corporation and Senior Vice President of the Research and Technology group of Xerox Corporation, based in Stamford, Conn.
Before joining Xerox in 1992, Gallaire headed the department of mathematics and computer science at l'École Nationale Supérieure de l'Aéronautique et de l'Espace in Toulouse, France, and directed several private and public research laboratories in Europe before managing hardware and software development divisions at Bull and GSI.
Gallaire holds a master of science degree and a PhD in electrical engineering and computer science from the University of California, Berkeley, and a master's degree in mechanical engineering from the École Nationale Supérieure des Arts et Métiers in France. He is the first recipient of the distinguished alumnus award of the EECS department at the University of California, Berkeley. He was elected a founding member of the Académie des technologies in France in December 2000; the Academy is the equivalent of the National Academy of Engineering in the United States.

Wednesday 25 August, 9h - 10h
Victor R. Basili
University of Maryland

The Role of Empirical Study in Software Engineering
Abstract
Although most scientific and engineering disciplines view empiricism as a basic aspect of their discipline, that view has not been the tradition in software engineering. There is not the same symbiotic relationship between theory and empirical study, each feeding the other for the evolution of the discipline. This talk discusses the role that empirical study plays in the understanding and improvement of the software product and process. It offers an historical perspective on the evolution of empirical methods and their application over time, and provides a wide-ranging set of example applications of empirical methods to demonstrate the various kinds of roles that empiricism can play. The examples are taken from the author's own experience and include the use of empirical study to improve an organization's product quality and productivity (NASA/Goddard), a series of experiments used to evolve a particular analytic technique (software artifact inspection), and current work on evaluating the effectiveness of various interventions for use in improving mission-critical software, studying the relationship between development and performance of high-end computing systems, and the development of an empirically based repository of software practices.
Biography
Dr. Victor R. Basili is Professor of Computer Science at the University of Maryland, College Park, and Executive Director of the Fraunhofer Center - Maryland. He was one of the founders and principals of the Software Engineering Laboratory (SEL) at NASA/GSFC. He works on measuring, evaluating, and improving the software development process and product. He has worked on the development of mechanisms for observing and evolving knowledge through empirical research, e.g., the Goal/Question/Metric Approach, the Quality Improvement Paradigm, and the Experience Factory. He is a recipient of many awards, including a 1989 NASA Group Achievement Award, a 1990 NASA/GSFC Productivity Improvement and Quality Enhancement Award, the 1997 Award for Outstanding Achievement in Mathematics and Computer Science from the Washington Academy of Sciences, the 2000 Outstanding Research Award from ACM SIGSOFT, and the 2003 Harlan Mills Award from the IEEE Computer Society. Dr. Basili has authored over 200 papers, has served as Editor-in-Chief of the IEEE Transactions on Software Engineering, and was 1982 Program Chair and 1993 General Chair of ICSE. He is co-editor-in-chief of the International Journal of Empirical Software Engineering. He is an IEEE and ACM Fellow. He received his Ph.D. in Computer Science from the University of Texas in 1970.
Thursday 26 August, 9h - 10h
Robin Milner
University of Cambridge

Grand Challenges in Computing Research: the Global Ubiquitous Computer
Abstract
The UK Computing Research Committee has launched a programme of Grand Challenges to focus the long-term aspirations of the computing research community (national and international), both in science and in engineering. At present there are seven proposals for such challenges, arising from ideas submitted to a workshop in 2002. For each proposal there is a core group of researchers aiming to form a road-map.
Two of these proposed Challenges involve what may be called the Global Ubiquitous Computer; it subsumes both the Internet and instrumented environments. Its name reflects the reasonable prediction that, within two decades, virtually all computing agents (heart-monitors, satellites, laptops, ...) will be interconnected, forming an organism that is partly artefact and partly natural phenomenon -- in either case one of the most complex ever constructed or studied. What models help us to understand it? What engineering principles can cope with the vast range of magnitudes involved?
My lecture will consider how to begin to address these two Challenges. Very many concepts are involved. They include authenticity, beliefs, connectivity, compilation, continuum, data-protection, delegation, duties, provenance, failure, intentions, locality, model-checking, mobility, obligations, reflectivity, security, simulation, specification, stochastics, trust, and many more.
Models are needed that explain and implement some of these concepts in terms of others. I shall end the lecture by describing some of my own work in modelling connectivity, locality and mobility. These notions arise naturally out of our existing models of concurrent computation, and can help to lay a foundation for global ubiquitous computation.
Biography
Robin Milner graduated at the University of Cambridge in 1957, in Mathematics and Philosophy. He worked at Ferranti Ltd (1960-63), The City University (1963-68), University College Swansea (1968-70), and the Artificial Intelligence Laboratory at Stanford University (1971-72). He joined the University of Edinburgh in 1973, became Professor of Computation Theory there in 1984, and with colleagues founded the Laboratory for Foundations of Computer Science there in 1986. He was elected Fellow of the Royal Society in 1988, and in 1991 won the A.M. Turing Award. He was appointed Professor at Cambridge in 1995, headed the Computer Laboratory there from January 1996 to October 1999, then became a Research Professor until retirement in 2001. In retirement he is fully active in research.
His implemented logical system LCF (Logic for Computable Functions) was a model for several later systems for computer-assisted reasoning. He led a team which designed and defined Standard ML, an industry-scale programming language whose semantic definition is fully rigorous. His main contribution has been to the theory of concurrent computation, especially the Calculus of Communicating Systems (CCS) and the Pi Calculus, reported in two books: "Communication and Concurrency" (Prentice Hall 1989) and "Communicating and Mobile Systems: the Pi Calculus" (CUP 1999). He now works on rigorous models of mobile informatic systems, reconciling virtual and real notions of locality, connectivity and mobility.

The Final Programme is online!
We are very proud to present an extremely attractive programme offering more than six hundred presentations.
This very rich programme offers a large variety of opportunities. Attendees will be able to compose their own menu, mixing leading-edge research and state-of-the-practice results in their own field of expertise with surveys and prospective views in other domains of interest. The overall schedule of sessions and the social events have been designed to facilitate fruitful interactions between attendees.
Join us for a week and share l'esprit de Toulouse!
Jean-Claude Laprie
Posters presenting WCC 2004 will be sent on request. We rely on you to give WCC 2004 wide publicity.