While the Internet of Things is enabled by technological advancements in computing, communications and data storage, the success of its applications will largely depend on non-technical matters. Acceptance of use cases by end users, trust in systems and organisations, and the redistribution of value in changing business models are just a few examples. Increasingly, these aspects are becoming important design factors in the architecture of the IoT.
What does that mean for professionals in information technology in the future?
Kees van der Klauw graduated from the department of Electronics Engineering of the Delft University of Technology in the Netherlands and received a Ph.D. in the area of semiconductor devices (CCDs) in 1987. He joined Philips Research, where he worked for several years on the design and characterization of CMOS devices and processes. In 1992 he moved to Philips' Flat Panel Displays, where he held positions in project management, engineering, operations and general management of Philips' LCD activities and was involved in the establishment of Philips' LCD joint ventures in Japan and Korea. In 1999 he joined Philips Consumer Electronics and became the development manager for High-End TV in Bruges, Belgium. From 2003 he was in charge of worldwide platform development for Philips Television, and in 2005 he became CTO of the Philips Television, Monitors and Professional Display business. Kees joined Philips Lighting in 2009 as Chief Architect and R&D Manager for Professional Lighting Solutions. From October 2013 he was Head of Research for Philips Lighting, and he played a key role in the split of corporate Philips Innovation into a Healthcare and a Lighting company.
In 2018, Kees founded InnoAdds, an independent innovation consultancy and interim management company.
Since 2015 Kees has been involved in the establishment of the Alliance for Internet of Things Innovation (AIOTI), and he is currently the first elected chairman of this registered association, driving IoT innovation in Europe.
The current trends in computer science research in “data and information management” are “big data”, “data analysis/analytics/science” and “machine learning”.
“Data is the gold of the information age” and “Data scientist is the sexiest job of the 21st century” are well-known statements in politics, business and science.
However, besides all these generally accepted trends and statements, the basic needs of industrial software development arise from the complex problem of integrating software (“interoperability”) and data at the same time, a problem that accounts for 80% (!) of all software development in the world. These problems heavily impact the everyday bread-and-butter workload in software companies, most of them SMEs in deeply specialized, domain-specific markets.
This presentation will give an overview, together with technical details and many examples drawn from a long personal record of successful experience in information systems development and integration methodology, based on three essential paradigms:
- Model (and meta-model) based methods for MBSDI: Why is it (still, and more than ever before) useful to “waste” a lot of time in the initial phases of software development on software and information modeling?
- Semantic concepts and ontologies: Why are mathematics and formal logic (still, and more than ever before) essential skills for every computer scientist and software developer?
- Patterns: Why should we not reinvent the wheel from scratch in each decade? Why not learn from the past?
In the context of BDAS, we will focus on the methodology of MBSDI and on concrete achievements in information-centric solutions for industry.
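To make the model-based integration idea above concrete, here is a minimal, hand-rolled sketch (not code from the talk; the record fields and mapping format are invented for illustration). The point of MBSDI is that the correspondence between two schemas is captured once, at the model level, and a single generic transformer interprets it, instead of writing ad-hoc glue code for every pair of systems:

```python
# Two systems describe the same customer with different schemas.
crm_record = {"full_name": "Ada Lovelace", "mail": "ada@example.org"}

# A declarative, model-level mapping: target field -> source field.
# The integration knowledge lives in this model, not in glue code.
CRM_TO_BILLING = {"name": "full_name", "email": "mail"}

def translate(record, mapping):
    """Generic transformer: interprets any model-level field mapping."""
    return {target: record[source] for target, source in mapping.items()}

billing_record = translate(crm_record, CRM_TO_BILLING)
print(billing_record)  # {'name': 'Ada Lovelace', 'email': 'ada@example.org'}
```

Changing either schema then means editing the mapping model only; the transformer code is untouched, which is the pay-off of "wasting" time on modeling up front.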
- High school education in Berlin at a “humanistic school” (Goethe-Gymnasium), i.e. 9 years of Latin and 5 years of Ancient Greek … poooh … (German “Abitur” in 1974)
- Diploma degree in Mathematics (Dipl.-Math.) at FU Berlin, with a minor in Computer Science at TU Berlin and further short minors in Philosophy, Psychology and Linguistics
- Diploma thesis (in German): “Differenzenverfahren für mehrdimensionale Diffusions-Konvektions-Probleme” (finite difference methods for multidimensional diffusion-convection problems), under the supervision of Prof. Dr. Rudolf Gorenflo, 1984, 163 pp. plus code listings in the appendix.
- Research associate in Theoretical Computer Science (ATFS department, later the TAL group, Prof. Siefkes) at TU Berlin
- Textbook (in German): Dieter Hofbauer, Ralf-D. Kutsche, “Grundlagen des Maschinellen Beweisens” (foundations of automated theorem proving), Vieweg, 1st and 2nd ed., 1989/1991.
- Research associate at the German Heart Institute Berlin (“DHZB”), analysing and designing complex integration and interoperability solutions for the cardiology department and external partners
- Result: Ph.D. at TU Berlin under the supervision of Prof. Dr. Bernd Mahr (TU Berlin) and Prof. Dr. Hans-Dieter Ehrich (TU Braunschweig, second reviewer) on applications of formal methods to clinical information systems: “A type-oriented approach to the specification and formal semantics of a distributed heterogeneous object system”, PhD thesis, TU Berlin, 1994. Title of the defense talk: “Bridging the gap between theory and practice”.
- Senior scientist (Academic Director) at the TU Berlin chair “Computation and Information Structures (CIS)” (Prof. Herbert Weber), establishing the focus area “Heterogeneous Distributed Information Systems”. At the same time, scientific coordinator, project leader and member of the board of leaders at Fraunhofer ISST (later on -> Fraunhofer FIRST -> Fraunhofer FOKUS).
- For 30 months from 2005, provisional head of the CIS group; from 2008 (until … ) Academic Director under the new professorship and chair of Prof. Dr. Volker Markl (research group now called the DIMA group)
- Science chair of both the BIZYCLE (2007 through 2010, approx. €6.5 million) and BIZWARE (2010 through 2013, approx. €11 million) project consortia, a large-scale initiative by TU Berlin and industry (6 and 8 SMEs, respectively) establishing “Model Based Software and Data Integration” as a focus area for SMEs in Germany in several business domains, funded by the German BMBF (under the framework programme “Regional Growth Cores”).
- Coordinator of DIMA’s international Master’s programmes: ERASMUS MUNDUS “IT4BI” (since 2012), EIT Digital “Data Science track” (since 2015), ERASMUS+ (EMMJC) “BDMA” (since 2017)
- Co-Initiator of the international Master’s programme IT4Energy with Faculty III/Campus El Gouna
Now is an exciting time to engage in creative design computing, to implement physically and computationally enhanced environments, and to explore experience media and interactive computing projects, working towards a smart living environment. Advancing technology offers new ways to solve problems, discover opportunities, and create new objects and experiences that delight our senses and improve the way we live and work. With a spark of creativity and enthusiasm, followed up with design and computational thinking, we can explore the goal of “creating unique technology for everyone” through the use of connective, ubiquitous technology for embodiments, in three themes: Tangible Interaction, Augmented Learning, and Embodied Experience.
Ellen Yi-Luen Do is Director of Innovation and Partnership at the ATLAS Institute, and Professor in the Department of Computer Science at the University of Colorado Boulder. Before joining CU in 2017, she was a professor at Georgia Tech’s School of Industrial Design and School of Interactive Computing. At Georgia Tech she directed the ACME Creativity Machine Group and the Healthcare Design of the Future interdisciplinary R&D initiative for integrating technology into built environments. She was a member of the GVU Center faculty, an affiliate at the Center for Music Technology, and core faculty at the Health Systems Institute, which hosted her office and lab. Ellen developed the Industrial Design Track of the MS-HCI degree program jointly with the School of Interactive Computing and served as an Associate Director of the program management team. She received a MDesS degree from Harvard University Graduate School of Design (1991) and a Ph.D. in Design Computing from Georgia Tech (1998). Before returning to Georgia Tech as a professor (in 2006), she taught at the University of Washington in Seattle (1999-2004, Design Machine Group) and Carnegie Mellon University (2004–2005, CoDe Lab). Ellen was on leave from Georgia Tech (2013–2016) to serve as the co-director of the Keio-NUS CUTE Center at the National University of Singapore.
Cryptographic algorithms and protocols are among the fundamental building blocks of modern digital systems. Cryptographic schemes provide security functions and by definition lie at the heart of any secure system. As with any heart failure, failures of cryptographic algorithms and implementations tend to be catastrophic. In environments such as hardware components, where patching and updates are difficult, the consequences may be even more dramatic, leading to long exposure times or even device recalls. In this talk we will go through a number of crypto bugs and weaknesses in various systems, from well-known CVEs to examples of bugs we prevented through our internal security validation activities. These examples will illustrate different classes of deficiencies: algorithmic fragility, design mistakes, implementation errors and validation gaps. The key takeaway of this talk will be suggestions on how to design secure systems to minimize the chance of cryptographic failures.
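As a small, hypothetical illustration of the "implementation errors" class (not one of the talk's own examples): verifying a MAC with an ordinary `==` comparison, which short-circuits on the first differing byte and can therefore leak timing information to an attacker, versus the constant-time comparison the standard library provides.

```python
import hashlib
import hmac

key, msg = b"secret-key", b"order=42&amount=100"
expected = hmac.new(key, msg, hashlib.sha256).digest()

def verify_broken(tag: bytes) -> bool:
    # Implementation error: '==' on bytes is not guaranteed constant-time,
    # so response timing may reveal how many leading bytes were correct.
    return tag == expected

def verify_fixed(tag: bytes) -> bool:
    # Constant-time comparison, designed for exactly this purpose.
    return hmac.compare_digest(tag, expected)

forged = bytes(len(expected))  # attacker's guess: all zero bytes
print(verify_broken(forged), verify_fixed(forged))  # False False
print(verify_fixed(hmac.new(key, msg, hashlib.sha256).digest()))  # True
```

Both functions return the same boolean result on any input; the bug is invisible to functional testing, which is why such defects tend to survive until a dedicated security validation pass.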
Krystian Matusiewicz has a broad range of experience in both academic research and industrial security engineering. He received his PhD in cryptography from Macquarie University in Sydney for his work on cryptanalytic attacks on hash functions. He was a postdoctoral researcher at the Technical University of Denmark and a lecturer at the Technical University of Wroclaw.
A co-author of constructions such as Groestl (a NIST SHA-3 finalist) and ICEPOLE (CAESAR competition), he has also authored attacks on a number of cryptographic constructions. Currently he is working at Intel as a security researcher in the Platforms Security Division, where he oversees the security of some of Intel's products and services. He is a member of Intel's cryptographic community and puts his passion for cryptanalysis to work by scrutinizing usages of cryptography in products across Intel.
Aisling Kelliher is an associate professor of Computer Science at Virginia Tech, with joint appointments in the School of Visual Arts and the Institute for Creativity, Arts, and Technology. Aisling creates and studies interactive media systems for enhancing reflection, learning, healing, and communication. She co-leads the Interactive Neurorehabilitation Lab at Virginia Tech where she works with a team of bioengineers, therapists, doctors, designers, and computer scientists in developing interactive systems for stroke rehabilitation in the home. Aisling is a member of the IEEE MultiMedia editorial board and writes or edits the regular “Artful Media” column. She served as the Paper Chair for ACM Creativity and Cognition in 2017 and for ACM Multimedia in 2016. She is also the regular technology correspondent on the “Culture File” show on Irish national radio. Aisling received a Ph.D. in Media, Arts and Sciences from the MIT Media Lab where she was a member of the Interactive Cinema Group. She also holds an MSc. in Multimedia Systems from Trinity College, Dublin, and a B.A. in Communications Studies from Dublin City University.
Is it even possible? The ACM Code of Ethics Journey
The ACM recently completed a multi-year project to update its Code of Ethics and Professional Conduct, undertaken because of the profound changes in the way computing interacts with society: computing now shapes even the most basic social infrastructures. These changes required extensively revisiting the ethical responsibilities of computing professionals.
Professional codes of ethics should represent the global conscience of the profession, not narrow political positions. Codes are also about society's obligations to the computing professional: they should make clear the rights of computing professionals to be free from unethical work demands. Computing professionals are now asked to work on systems which can surreptitiously censor the Internet, gather data on every aspect of our lives, and develop algorithms which amplify existing human biases when they make judgments that affect society and its citizens.
Codes of ethics should make clear the obligations computing professionals have to the profession at large and their obligations to society. These are obligations concerning how they approach their work and how they promote an ethical approach to the profession.
Above all, a code should help the computing professional work through complex ethical decisions. It should actually be of some practical use in answering questions such as: how do you hard-code ethics into a computer system, how do you make algorithms accountable, and how do you address the risks in machine learning systems? It should also fit on an A4 poster. This talk, using real-world examples from the ACM Code update project, will focus on several positive lessons learned about the ethics of computing professionals, ethical negotiation in the practice of the profession, and reducing philosophical distractions to practical ethical issues. We shall also discuss ways of facilitating and encouraging professionals' attention to the ethical side of technology.
Don Gotterbarn, a Professor Emeritus at East Tennessee State University, is a leading author of the Software Engineering Code of Ethics and Professional Practice, which promotes ethics among software engineers. Active in professional computer ethics for over 20 years, Gotterbarn received the 2005 Outstanding Contribution to ACM Award for his leadership as both an educator and practitioner, and for promoting the ethical behavior of computing professionals and organizations. He received the ACM SIGCAS Making a Difference Award in 2002 for his research and work regarding computer and software engineering ethics. Gotterbarn is also an ACM Distinguished Speaker and chairs the ACM Committee on Professional Ethics. In the mid-1970s, he left a career teaching philosophy and entered the computing field as a consultant for clients that included the US Navy and the Saudi Arabian Navy. He has also worked on the certification of software for vote-counting machines and missile defense systems.
ICT now consumes 5-8% of electrical energy worldwide on average, with higher values, around 10%, in developed countries. The amount is growing worldwide at a rate of roughly 5%, and electricity expenditures are a major part of the costs of telecommunications and computer operations. Due to the resulting environmental impact, this may one day create both cost and societal barriers to the development of ICT. Based on our recent research, we will describe how a computer or communication system can better balance the requirements of energy costs and quality of service. We will also discuss some future technologies that can considerably reduce these expenditures of electrical energy.
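One common way to frame the energy/QoS balance mentioned above is as the minimisation of a composite cost. The toy sketch below (the model and all parameters are invented for illustration, not taken from the talk) picks a service rate that minimises a weighted sum of power, assumed to grow quadratically with speed, and M/M/1 mean response time:

```python
lam = 1.0                    # arrival rate of jobs (jobs/s)
w_energy, w_qos = 1.0, 10.0  # weights expressing cost priorities

def cost(s: float) -> float:
    power = s ** 2              # dynamic power grows superlinearly with speed
    response = 1.0 / (s - lam)  # M/M/1 mean response time (requires s > lam)
    return w_energy * power + w_qos * response

# Coarse one-dimensional search over feasible service rates s > lam.
candidates = [lam + 0.1 * k for k in range(1, 100)]
best = min(candidates, key=cost)
print(f"best service rate ~ {best:.1f}, composite cost = {cost(best):.2f}")
```

Running slower saves power but lets queueing delay blow up as s approaches the arrival rate; running faster wastes energy for diminishing QoS gains, so the optimum sits in between and shifts with the weights.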
Sami Erol Gelenbe is a Turkish-French computer scientist, electronic engineer and applied mathematician who is a professor in Computer-Communications at Imperial College London. Known for pioneering the field of modelling and performance evaluation of computer systems and networks throughout Europe, he invented the random neural network and the eponymous G-networks. His many awards include the ACM SIGMETRICS Life-Time Achievement Award and the In Memoriam Dennis Gabor Award of the Hungarian Academy of Sciences.
Optimisation of extraction-transformation-loading (ETL) workflows can be a very complex process. There are many dimensions in which ETL can be optimised, to name a few: execution time, resource consumption, simplicity of maintenance, reusability and, finally, total cost of ownership (which is affected by all the others). On the other hand, optimisation can be achieved by many means: data source tuning, hardware upgrades, parallelisation, pushdown mechanisms and other rewritings of the ETL definition. Many such optimisations can already be performed automatically with proper tooling, yet many methods are still waiting to be invented and implemented. Together with PUT we investigate the efficiency of automated ETL optimisation in the area of Hadoop data sources. I am going to provide an introduction to these aspects of optimisation, supported by real-life examples.
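To illustrate the pushdown idea mentioned above, here is a minimal sketch (the table and data are invented; a real ETL tool would rewrite the job definition automatically rather than rely on hand-written SQL). Pushing a filter into the data source means only the matching rows ever leave the database:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INT)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("EU", i) for i in range(1000)]
                + [("US", 1) for _ in range(9000)])

# Naive ETL: extract everything, then filter in the integration layer.
extracted = con.execute("SELECT region, amount FROM sales").fetchall()
eu_naive = [row for row in extracted if row[0] == "EU"]

# Pushdown: the filter is evaluated inside the data source, so only
# a tenth of the rows are transferred to the ETL engine.
eu_pushed = con.execute(
    "SELECT region, amount FROM sales WHERE region = 'EU'").fetchall()

assert eu_naive == eu_pushed  # same result, far less data movement
print(len(extracted), "rows moved naively vs", len(eu_pushed), "with pushdown")
```

The result is identical either way; the optimisation changes only where the work happens, which is why such rewritings are good candidates for automation.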
Michał Bodziony is a senior performance specialist at IBM. During 20 years of professional experience he has played the roles of software and performance architect in several projects, always focused on best performance. For several years he drove the architecture of the Optim Performance Manager tooling, and then for a few years he was the Performance Architect for IBM PureData for Analytics (a.k.a. Netezza). In recent years he has been involved in the development of IBM Unified Governance & Integration (focused on ETL performance and overall portfolio security). He is an author of several patent applications and many publications, mostly focused on performance optimisation.
Empirical methods in software engineering have gained increasingly wide acceptance in the last 20 years. Experiments, case studies and surveys are increasingly often published in top software engineering journals and conferences. However, there is one more research method which is rapidly gaining acceptance and spread: action research. Action research is a methodology in which industrial practice and scientific research go hand in hand to improve the practice and build new theories in software engineering. In this talk, we go through the principles of action research in general and how they are applied in software engineering. We identify the characteristics that make action research in software engineering special and explore how to conduct action research in the best way. The talk finishes with examples of software engineering metrics research at Chalmers | University of Gothenburg, conducted in close collaboration with industrial partners.
Mirosław Staroń is a Professor in Software Engineering at the University of Gothenburg, IT Faculty. His research interests include software metrics, mining software repositories, profiling product and organizational performance, AUTOSAR, ISO 26262, and automotive software engineering. He has extensive experience of cooperation with Swedish industry, especially in the areas of software metrics and automotive software engineering.
In the Hollywood blockbuster movie about Alan Turing, The Imitation Game, Alan Turing is frequently seen with an Enigma machine - and it's widely assumed that the codebreakers of Bletchley Park had Enigma machines to hand. Actually, the truth is a bit more mysterious, and more complicated. Before July 1939 the British had only the haziest idea of how the German Wehrmacht-model Enigma machine worked, and it may seem miraculous that Alan Turing was able to design a machine method to break Enigma within a few months of the start of the war. How was this possible - and did Alan Turing actually see an Enigma machine?
Dermot Turing graduated from King's College Cambridge and New College Oxford. He spent his career in the legal profession, most recently as a partner at Clifford Chance. Since 2014 he has moved into a more varied range of activities, including an active role as a trustee of Bletchley Park and as a volunteer and trustee of the Turing Trust, a charity which sends second-hand computers for a new life in schools in Africa. Dermot Turing is the nephew of Alan Turing and the author of a biography of Turing (Prof: Alan Turing Decoded, published in 2015 by The History Press) as well as The Story of Computing, published by Arcturus in 2018. His most recent book is X, Y and Z - The Real Story of How Enigma Was Broken (September 2018, The History Press), which explains how the vital groundwork done by Polish code-breakers and French intelligence enabled Alan Turing and the Bletchley Park organisation to achieve their wartime successes.
This talk explains the growing impact of emerging information technologies, such as the Internet of Things, augmented reality, biometrics, cloud computing and big data, on personal privacy. The modus operandi of several specific privacy-destroying technologies is analyzed, as well as the mutual dependencies among the presented technologies.
Wojciech Cellary received the M.Sc. (1974), Ph.D. (1977) and Dr.Hab. (1981) degrees, all from the Technical University of Poznan. In 1989 he received the title of Professor. From 1974 to 1992 he was with the Technical University of Poznan, serving from 1987 to 1991 as the scientific director of the Institute of Computing Science. From 1992 to 1996 he served as the vice-president responsible for research of the Franco-Polish School of New Information and Communication Technologies. In 1996 he joined the Poznan University of Economics, where he is currently the head of the Department of Information Technology. He has been a visiting professor at the following universities in France and Italy: University of Nancy I, University of Nancy II, University of Paris-Sud, University of Paris-Dauphine, University of Genova, and University of Ancona. He has been a leader of numerous research and industrial projects on the development of hardware and software of computer systems and their applications in telecommunications, the computer industry, the electric power industry, education and administration. The projects were supported by Polish, French and American industry, as well as by the 4th, 5th, and 6th EU Framework Programmes. He has served as a consultant to the Polish Ministry of Science and Higher Education, Ministry of Administration and Digitalization, Ministry of Telecommunications, Ministry of Regional Development, Ministry of Economy, the Polish Parliament, and several research institutes and governmental projects. He has been involved in the organization of 38 national and international scientific conferences and has been a member of the program committees of an additional 360 conferences. He is an author or co-author of numerous publications: 11 books, 21 chapters in books, and over 155 articles in journals and conference proceedings.
Besides research and teaching, his professional activity encompasses consulting and membership in numerous professional organizations, editorial boards of scientific journals, think tanks, committees, councils, and various associations. He is a recipient of many awards for achievements in research and teaching. He has supervised 17 PhD theses, 4 of which received distinction; one of them received two awards in national contests for outstanding PhD theses organized by two professional societies. In 2002 he was a representative of Poland at the General Assembly of the United Nations devoted to "ICT for Development".