Events: Information and Communication Technology events at Chalmers University of Technology. Wed, 24 May 2017 11:25:44 +0200. Al Nahas, Computer Science and Engineering<p>KB</p><p>Synchronous Protocols for Low-Power Wireless: Towards Reliable and Low-Latency Autonomous Networking for the Internet of Things and Wireless Sensor Networks</p>With the emergence of the Internet of Things, autonomous vehicles and Industry 4.0, the need for reliable yet dynamic connectivity solutions is growing. Many of these applications build their operation on distributed consensus. For example, networked cooperative robots and UAVs agree on manoeuvres to execute, and industrial control systems agree on set-points for actuators. Many applications are also mission- and safety-critical: failures could cost lives or incur economic losses. <br /><br />Any wireless network connecting safety-critical devices must be reliable, and often energy-efficient, as many devices are battery-powered and expected to last for years. It must also be self-forming and self-healing to allow for reliable autonomous operation, since many applications cannot afford to stop and wait for external configuration. In this context, synchronised communication has emerged as a prime option for low-power critical applications. Solutions such as Chaos or Time Slotted Channel Hopping (TSCH) have demonstrated end-to-end reliability upwards of 99.99 percent. <br /><br />In this thesis, we design and implement protocols to support highly reliable, low-latency communication in low-power wireless settings. First, we present a standards-based solution that integrates with the 6TiSCH stack (IPv6 over TSCH) without the need for static scheduling or schedule negotiation. Second, we identify key challenges in implementing the 6TiSCH stack and demonstrate how they can be addressed. Then, we take a step beyond the standards and focus on synchronous network flooding protocols such as Glossy and Chaos. 
We show how to enhance them by adding time-slotting and frequency diversity to achieve high reliability and low latency under interference. Finally, we design and realise a network stack that combines and extends ideas from TSCH and synchronous transmissions to achieve highly reliable data delivery, with a loss rate lower than x, at a radio duty cycle of 0.5 percent. On top of this robust kernel, we enable two- and three-phase commit protocols to provide network-wide consensus. <br /><br />We implement our protocols, evaluate them on public testbeds of sensor nodes equipped with IEEE 802.15.4-compatible radios, and compare them to state-of-the-art protocols. We contribute the source code of our main protocols to the community as a step towards enabling ubiquitous connectivity in the context of the Internet of Things. FOR A SUSTAINABLE SOCIETY<p>Chalmers conference center, Johanneberg</p><p>​Embodied, Embedded, Networked, Empowered through Information, Computation &amp; Cognition! Gothenburg, Sweden, 12-16 June 2017</p>​<img src="/SiteCollectionImages/Areas%20of%20Advance/Information%20and%20Communication%20Technology/News%20events/Calendar/is4si-logo.jpg" class="chalmersPosition-FloatRight" alt="" style="margin:5px" /><br />Welcome to the IS4SI 2017 Summit, presented by the International Society for Information Studies.<br /><br />Go to conference web: <a href="" target="_blank"></a><br /><br />Welcome to IFIPTM 2017 in Gothenburg!<p>Jupiter building, Lindholmen.</p><p></p>IFIPTM, the IFIP WG 11.11 International Conference on Trust Management, is coming to Gothenburg! <br />The trust management community will convene for five days and engage in inspiring talks, technical demonstrations, a graduate symposium, and many other activities. 
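The frequency diversity that the TSCH-based designs in the Al Nahas thesis abstract above build on comes from channel hopping: every cell in the schedule lands on a different physical channel in successive slotframes. A minimal sketch of the standard IEEE 802.15.4e hopping computation, where the particular hopping sequence and the slotframe length of 101 are illustrative values, not taken from the thesis:

```python
# Sketch of TSCH-style channel hopping (IEEE 802.15.4e / 6TiSCH).
# The shuffled sequence below covers the 16 IEEE 802.15.4 channels
# (11..26) in the 2.4 GHz band; the ordering is illustrative.
HOPPING_SEQUENCE = [16, 17, 23, 18, 26, 15, 25, 22, 19, 11,
                    12, 13, 24, 14, 20, 21]

def tsch_channel(asn: int, channel_offset: int) -> int:
    """Map a scheduled cell to a physical channel.

    asn            -- Absolute Slot Number, shared by all synchronised nodes
    channel_offset -- the cell's logical channel, fixed in the schedule
    """
    return HOPPING_SEQUENCE[(asn + channel_offset) % len(HOPPING_SEQUENCE)]

# With a slotframe length (101) coprime to the sequence length (16),
# the same cell hops to a new channel every slotframe -- this is the
# frequency diversity that defeats narrowband interference.
for asn in (0, 101, 202):            # the same cell in three slotframes
    print(asn, tsch_channel(asn, channel_offset=3))   # 18, 19, 14
```

Choosing the slotframe length coprime to the hopping-sequence length is what guarantees the cell eventually visits every channel.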
The mission of the IFIPTM conference is to share research solutions to problems of trust and trust management, including related security and privacy issues, and to identify new issues and directions for future research and development. Behrooz Sangchoolie, Computer Science and Engineering<p>EA</p><p>On Efficient Measurement of the Impact of Hardware Errors in Computer Systems</p><img src="/SiteCollectionImages/Institutioner/DoIT/Profile%20pictures/CE/Behrooz.jpg" class="chalmersPosition-FloatRight" alt="" style="margin:5px;width:140px;height:182px" />Technology and voltage scaling are making integrated circuits increasingly susceptible to failures caused by soft errors. Soft errors arise from temporary hardware faults that alter data and signals in digital circuits. They are predominantly caused by ionizing particles, electrical noise and wear-out effects, but may also occur as a result of marginal circuit designs and manufacturing process variations. <br /><br />Modern computers are equipped with a range of hardware- and software-based mechanisms for detecting and correcting soft errors, as well as other types of hardware errors. While these mechanisms can handle a variety of errors and error types, protecting a computer completely from the effects of soft errors is technically and economically infeasible. Hence, in applications where reliability and data integrity are of primary concern, it is desirable to assess and measure the system's ability to detect and correct soft errors. <br /><br />This thesis is devoted to the problem of measuring the hardware error sensitivity of computer systems. We define hardware error sensitivity as the probability that a hardware error results in an undetected erroneous output. 
Since the complexity of computer systems makes it extremely demanding to assess the effectiveness of error handling mechanisms analytically, error sensitivity and related measures, e.g., error coverage, are in practice determined experimentally by means of fault injection experiments. <br /><br />The error sensitivity of a computer system depends not only on the design of its error handling mechanisms, but also on the program executed by the computer. In addition, measurements of error sensitivity are affected by the experimental set-up, including how and where the errors are injected, and by the assumptions about how soft errors are manifested, i.e., the error model. This thesis identifies and investigates six parameters, or sources of variation, that affect measurements of error sensitivity. These parameters fall into two subgroups: those that deal with system characteristics, namely (i) the input processed by a program, (ii) the program's source code implementation, and (iii) the level of compiler optimization; and those that deal with the measurement set-up, namely (iv) the number of bits targeted in each experiment, (v) the location in which faults are injected, and (vi) the time of injection. <br /><br />To accurately measure the error sensitivity of a system, one needs to conduct several sets of fault injection experiments, varying these sources of variation. As such experiments are quite time-consuming, it is desirable to improve the efficiency of fault injection-based measurement of error sensitivity. To this end, the thesis proposes and evaluates error space optimization and error space pruning techniques that reduce the time and effort needed to measure error sensitivity. 
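The fault-injection-based measurement described in the abstract above can be sketched in miniature. The toy target program, the single-bit-flip error model and the crude range-check "detector" below are all illustrative assumptions, not the thesis's actual tool or targets; the sketch only shows the shape of such an experiment campaign:

```python
# Minimal sketch of measuring error sensitivity by fault injection.
import random

def program(x: int) -> int:
    # Toy target program whose golden (fault-free) output we can compute.
    return (x * 3 + 7) & 0xFFFFFFFF

def inject_bit_flip(value: int, bit: int) -> int:
    # Error model: a soft error flips a single bit of a stored value.
    return value ^ (1 << bit)

def measure_error_sensitivity(trials: int, seed: int = 0) -> float:
    """Estimate the probability that an injected error escapes detection
    and corrupts the output (the thesis's definition of error sensitivity)."""
    rng = random.Random(seed)
    undetected = 0
    for _ in range(trials):
        x = rng.randrange(2 ** 16)           # source of variation: input
        golden = program(x)
        # Sources of variation: which bit is flipped, and where/when --
        # here we inject into the input register before execution.
        faulty = program(inject_bit_flip(x, rng.randrange(32)))
        # Stand-in for the system's error handling mechanisms: a range
        # check that flags wildly out-of-range outputs.
        detected = faulty >= 2 ** 18
        if faulty != golden and not detected:
            undetected += 1
    return undetected / trials

print(measure_error_sensitivity(2000))
```

A real campaign varies all six parameters listed above (input, implementation, optimization level, bit count, location, injection time), which is exactly why the error space grows so large that the pruning techniques the thesis proposes become necessary.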
Streaming in Big Data Analysis<p>EC</p><p>​Docent lecture by Vincenzo Gulisano.</p>​<img src="/SiteCollectionImages/Institutioner/DoIT/Profile%20pictures/NS/Vincenzo.gif" class="chalmersPosition-FloatRight" alt="" style="margin:5px;width:135px;height:176px" />Since the year 2000, data streaming has gained popularity over the store-then-process database paradigm, enabling high-throughput, low-latency continuous analysis of unbounded sources of data. Almost two decades later, it is proving particularly helpful in the processing and analysis of big data in cyber-physical systems such as smart grids or vehicular networks. This lecture presents the motivations and challenges behind data streaming, gives an overview of the basics of the streaming paradigm and discusses the open research paths in big data analysis for cyber-physical systems. ACM Symposium on Virtual Reality Software and Technology (VRST) 2017<p>Chalmers Lindholmen, Gothenburg</p><p>​VRST will provide an opportunity for VR researchers to interact, share new results, show live demonstrations of their work, and discuss emerging directions for the field.</p>​ <br />The ACM Symposium on Virtual Reality Software and Technology (VRST) is an international forum for the exchange of experience and knowledge among researchers and developers concerned with virtual reality software and technology. <br /><br /><a href="">To conference website</a><br /><br />
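The streaming-versus-store-then-process contrast in the docent lecture abstract above can be illustrated with a minimal continuous operator: a time-based sliding-window average that keeps only in-window tuples as state, so it never stores the full dataset. The tuple layout and the smart-meter framing are illustrative, not taken from the lecture:

```python
# Minimal sketch of a data streaming operator: a time-based
# sliding-window average over an unbounded input stream.
from collections import deque

class SlidingWindowAvg:
    """Continuous average over the last `window` time units."""

    def __init__(self, window: float):
        self.window = window
        self.buf = deque()      # (timestamp, value), ordered by time
        self.total = 0.0

    def on_tuple(self, ts: float, value: float) -> float:
        # Evict tuples that fell out of the window: state stays small,
        # which is what gives streaming its low latency.
        while self.buf and self.buf[0][0] <= ts - self.window:
            _, old_v = self.buf.popleft()
            self.total -= old_v
        self.buf.append((ts, value))
        self.total += value
        return self.total / len(self.buf)

# E.g. smart-grid meter readings arriving once per second,
# continuously averaged over a 3-second window:
avg = SlidingWindowAvg(window=3.0)
for ts, watts in [(0, 100.0), (1, 110.0), (2, 120.0), (3, 130.0)]:
    print(ts, avg.on_tuple(ts, watts))
```

Each output is produced as the tuple arrives; a store-then-process system would instead load all readings into a database first and only then run the query.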