IBM INNOVATION AWARDS 2018 - OCTOBER 17, 2018 - FRS-FNRS
IBM INNOVATION AWARDS 2018
OCTOBER 17, 2018

For further information:
Bruno MORAUX, F.R.S. - FNRS, bruno.moraux@frs-fnrs.be, 02/504.92.40
Bart VAN BEEK, FWO, bart.vanbeek@fwo.be, 02/550.15.98
IBM INNOVATION AWARDS 2018

Thanks to the generous patronage of IBM Belgium, the IBM Innovation Awards have been granted every year since 1975 by the Fund for Scientific Research - FNRS and the Research Foundation Flanders (FWO). These Awards reward the best doctoral theses presenting an original contribution to informatics or its applications in one of the following fields:
- Cognitive Computing
- Social Media
- Analytics & Big Data
- Mobile Computing
- Cyber Security
- Cloud Computing
The complete regulations of the IBM Innovation Awards are available on www.frs-fnrs.be and www.fwo.be.

For the F.R.S. - FNRS, the Award is granted to:

Adrien TAYLOR
PhD in mathematical engineering - UCL (FRIA Fellowship)
Master in mathematical engineering - KU Leuven / UCL
Postdoctoral Researcher - École Normale Supérieure / INRIA, France

for his PhD thesis: "Convex interpolation and performance estimation of first-order methods for convex optimization".

The users (mostly non-experts) of numerical optimization algorithms generally need guaranteed performance before spending hours in numerical processes. Such guarantees are crucial, as they allow the user to judge a priori what to expect from the optimization schemes under consideration (e.g., will it work? How long will it run?). However, in most cases, performance characterization requires a lot of insight and is therefore mostly reserved for experts in the field. Hence, those analyses are often out of reach for the final user.

Worst-case analysis is among the most iconic and widespread generic approaches for assessing the efficiency of numerical algorithms. Essentially, given a class of problems and an algorithm designed to solve them, worst-case analysis focuses on the problem instances
on which the algorithm behaves in the worst possible way. The user may then be satisfied with the algorithm if its behaviour on the worst possible problem instance is reasonable (e.g., the algorithm does not require too much time, even in the worst case).

In this work, we provide a systematic approach to worst-case analyses in the context of (mostly, but not limited to) convex optimization. We aim at rendering this type of analysis accessible to a wider audience, allowing users to very quickly and rigorously assess the performance of their optimization methods, for a range of standard settings.
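In illustrative notation (a simplified sketch, not taken verbatim from the thesis), the worst-case performance of an algorithm A run for N iterations on a class of functions \mathcal{F}, starting at distance at most R from a minimizer x_\star, is itself the value of an optimization problem over that class:

w(A, N, R) \;=\; \sup_{f \in \mathcal{F},\ \|x_0 - x_\star\| \le R} \big( f(x_N) - f(x_\star) \big), \qquad x_N \text{ generated by } A \text{ applied to } f .

The contribution summarized above is a systematic way of computing this supremum exactly for broad families of first-order methods.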
IBM INNOVATION AWARDS 2018

For the FWO, the Award is granted to:

Sujoy SINHA ROY
PhD in electrical engineering - KU Leuven
Master of Technology in electromechanics - Indian Institute of Technology, India
Postdoctoral Researcher - KU Leuven

for his PhD thesis: "Public Key Cryptography on Hardware Platforms: Design and Analysis of Elliptic Curve and Lattice-based Cryptoprocessors".

Last November, IBM established a landmark in quantum computing by announcing a 50-qubit quantum computer. Quantum computing certainly has the potential to improve our lives by performing tasks that are not feasible on today's computers, such as discovering new drugs and building molecular structures; but it also threatens the existing information security infrastructure. Shor's algorithm running on a powerful quantum computer can break RSA and elliptic-curve-based cryptographic schemes, which are considered the two pillars of our present-day public-key infrastructure.

The state-of-the-art IBM quantum computer is not powerful enough to break the present-day public-key infrastructure, but this giant leap suggests that a future quantum computer might be powerful enough to break both schemes. Post-quantum cryptography is the branch of cryptography that focuses on designing schemes that are secure against quantum computing attacks. In recent years, several hard problems from lattice theory have become popular for constructing post-quantum public-key cryptographic schemes.

Besides post-quantum cryptography, lattice problems have been used to construct homomorphic encryption schemes. Homomorphic encryption has applications in privacy-preserving cloud computing: users can upload their encrypted data to the cloud and still perform computations on the encrypted data. While, in theory, lattice-based cryptography offers wide applicability, computational efficiency, and strong security, its real deployment in a wide variety of computing devices and applications faces several challenges. My research is aimed at solving the fundamental problem of implementing cryptography based on hard lattice problems in hardware and software.
MEMBERS OF THE JURY F.R.S. - FNRS 2018

DUTOIT Thierry, Professor at UMONS
LEDUC Guy, Professor at ULIÈGE
PEREIRA Olivier, Professor at UCL
STÜTZLE Thomas, F.R.S.-FNRS Research Director at ULB
MEMBERS OF THE JURY FWO 2018

DEMEESTER Piet, Professor at UGent
EECKHOUT Lieven, Professor at UGent
GEERTS Floris, Professor at UAntwerpen
MOENS Marie-Francine, Professor at KU Leuven
PRENEEL Bart, Professor at KU Leuven

* * *

VAN HOREBEEK Ivo, Academic Relations Belgium-Luxembourg, IBM Belgium (Observer)
Adrien TAYLOR
UCL

“Convex interpolation and performance estimation of first-order methods for convex optimization”
Summary

Context

This work takes place within the field of mathematical optimization. More specifically, it deals with the analysis of optimization algorithms. Optimization is omnipresent in the sciences, especially when it comes to modelling (e.g., what model best explains my data?) and decision making (e.g., what is the most efficient trajectory on a race track?), with applications ranging from the modelling of physical or biological processes to machine learning, control engineering, signal and image processing, and much more.

In a few words

This thesis deals with the questions “how do we prove that an optimization algorithm works?” and “what can we expect from an optimization algorithm?”. In that perspective, and skipping all details, the main innovation of this work is the development of a rigorous and systematic approach to worst-case analyses of optimization algorithms. The methodology is applied to the study of first-order methods, with a particular focus on convex optimization (although we do not restrict ourselves to it). This is motivated by the fact that first-order methods are currently at the heart of modern machine learning and big data applications, due to their low (or scalable) computational cost, whereas convex optimization allows modelling a very broad range of learning problems (including many regression and classification problems) while providing strong computational guarantees.

Novelties

We develop an automated approach to the analysis of optimization algorithms (i.e., the computer does the analyses for you), based on the computation of achievable worst-case scenarios. The thesis focuses on three important aspects in that direction: (i) rendering worst-case analyses of optimization schemes accessible to a much wider audience of non-experts; (ii) studying scenarios in which even experts have no, or little, insight; and (iii) reducing the gap between theory and practice (i.e., reducing conservatism).

The approach (called performance estimation) aims at automatically generating worst-case certificates (i.e., proofs that the given algorithm works) along with corresponding worst-case scenarios: it can be seen as a systematic computer-assisted methodology for performing worst-case analyses. In the case of first-order optimization schemes (e.g., for convex optimization), applying the methodology amounts to solving semidefinite optimization problems (SDPs), which can be tackled very efficiently using any mature SDP solver. Note, though, that the required SDP modelling steps may be tedious; hence, we provide a toolbox (called PESTO) that performs both the modelling and the analysis.
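As a concrete illustration of how the performance estimation approach works (a simplified sketch with illustrative notation, not reproduced verbatim from the thesis), consider estimating the worst-case accuracy of N steps of gradient descent, x_{k+1} = x_k - \tfrac{1}{L}\nabla f(x_k), over the class \mathcal{F}_L of L-smooth convex functions:

\begin{align*}
w(N, L, R) \;=\; \max_{f \in \mathcal{F}_L,\ x_0, \dots, x_N,\ x_\star} \quad & f(x_N) - f(x_\star) \\
\text{s.t.} \quad & x_{k+1} = x_k - \tfrac{1}{L} \nabla f(x_k), \qquad k = 0, \dots, N-1, \\
& \nabla f(x_\star) = 0, \qquad \|x_0 - x_\star\| \le R .
\end{align*}

The infinite-dimensional variable f is then replaced by the finitely many values f_k = f(x_k) and gradients g_k = \nabla f(x_k), constrained by smooth convex interpolation conditions of the form

f_i \;\ge\; f_j + \langle g_j, x_i - x_j \rangle + \tfrac{1}{2L} \|g_i - g_j\|^2 \qquad \text{for all pairs } i, j,

which guarantee that some function in \mathcal{F}_L actually takes the prescribed values and gradients. Collecting all inner products in a positive semidefinite Gram matrix then turns the problem into a semidefinite program that a standard SDP solver can solve to global optimality.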
Main results

The main results of the thesis can be split into four categories.

- A rigorous and systematic methodology for (automatically) performing worst-case analyses of first-order methods for (mostly, but not limited to, convex) optimization, using semidefinite programming (SDP).
- Application of the methodology to the analysis and design of optimization schemes, including projected, proximal, conditional and inexact gradient schemes, along with accelerated variants. In particular, we establish new (tight) analytical worst-case guarantees for the proximal point algorithm, for steepest descent with exact line search, and for the (proximal) gradient method (a numerical sketch of this kind of computation is given after this summary).
- A toolbox, allowing other researchers to use the framework without worrying about the (potentially tedious) modelling steps involving SDPs. In addition to automating the modelling steps, it minimizes the risk of mistakes in the process by letting users write human-readable code.
- Theoretical results on nonparametric interpolation over several classes of functions (including smooth strongly convex functions). Those results are among the main building blocks of the performance estimation methodology but are also of independent interest (e.g., for performing specific convex regressions).

Conclusion and perspectives

The performance estimation approach is a totally novel way of automatically evaluating the quality of optimization algorithms. Building on original ideas by Yoel Drori (Google) and Marc Teboulle (Tel-Aviv University), who developed the methodology in specific cases, we lay theoretical foundations for this approach and propose a clean and largely extended version of the framework for analysing first-order methods. Simply put, the evaluation of the worst-case performance of standard (first-order, mostly convex) optimization algorithms is formulated as an optimization problem that we can solve exactly in surprisingly many situations. This yields exact worst-case performances together with examples of “worst” functions and initial conditions achieving them.

We expect the PEP approach to play a disruptive role in the analysis and design of optimization methods, as researchers are now able to immediately test the quality of the algorithms they think of, without even needing to understand the theory behind the PEP, and are provided with examples of the functions on which their algorithms perform poorly. This should rapidly guide them in the iterative improvement of their methods. Finally, as modern optimization settings tend to become more and more complicated (e.g., in stochastic and distributed optimization), automated analysis tools may become very valuable assets even for experts.
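To make the previous point tangible, here is a minimal numerical sketch of the performance estimation problem for a single gradient step on an L-smooth convex function. It is written with the CVXPY Python package rather than the PESTO toolbox mentioned above; the variable names, toy parameters (L = R = 1, one iteration) and the exact formulation are illustrative assumptions, not code from the thesis.

import numpy as np
import cvxpy as cp

L, R = 1.0, 1.0                  # smoothness constant and bound on ||x0 - x*|| (toy values)

# Basis for the Gram matrix: v1 = x0 - x*, v2 = grad f(x0), v3 = grad f(x1).
e = np.eye(3)
x = {"star": np.zeros(3), 0: e[0], 1: e[0] - e[1] / L}   # coefficients of x_i - x*
g = {"star": np.zeros(3), 0: e[1], 1: e[2]}              # coefficients of grad f(x_i)
f = {"star": 0, 0: cp.Variable(), 1: cp.Variable()}      # function values, with f(x*) = 0

G = cp.Variable((3, 3), PSD=True)                        # Gram matrix of (v1, v2, v3)
dot = lambda a, b: a @ G @ b                             # inner products expressed through G

constraints = [dot(x[0], x[0]) <= R ** 2]
# Interpolation conditions for L-smooth convex functions, for every ordered pair of points.
points = ["star", 0, 1]
for i in points:
    for j in points:
        if i != j:
            constraints.append(
                f[i] >= f[j] + dot(g[j], x[i] - x[j])
                + dot(g[i] - g[j], g[i] - g[j]) / (2 * L)
            )

problem = cp.Problem(cp.Maximize(f[1]), constraints)     # worst case of f(x1) - f(x*)
problem.solve()
print(problem.value)   # expected close to L * R**2 / 6, the bound known for one gradient step

The optimal value of the SDP should come out close to L R^2 / (4N + 2) = 1/6, which is the worst-case guarantee reported in the literature for fixed-step gradient descent in this setting.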
Curriculum Vitae
Sujoy SINHA ROY
KU Leuven

"Public Key Cryptography on Hardware Platforms: Design and Analysis of Elliptic Curve and Lattice-based Cryptoprocessors"
Summary

The main contribution of my PhD thesis is the study of implementation aspects of ring Learning with Errors (ring-LWE) based cryptography. Ring-LWE is a hard problem in lattices. Cryptographic schemes based on the ring-LWE problem perform arithmetic operations in a polynomial ring and require high-precision sampling from a discrete Gaussian distribution.

Discrete Gaussian sampling is an integral part of many lattice-based cryptosystems such as public-key encryption schemes, digital signature schemes and homomorphic encryption schemes. We chose the Knuth-Yao sampling algorithm and proposed a novel implementation of the algorithm based on an efficient traversal of the discrete distribution generating tree. We investigated various optimization techniques to achieve minimum area and computation time. Next, we studied timing and power analysis based attacks on the Knuth-Yao sampler and proposed a random shuffling countermeasure to protect the Gaussian distributed samples against such attacks.

For efficient polynomial multiplication, we applied the Number Theoretic Transform (NTT). We proposed three optimization techniques for the NTT to speed up computation and reduce resource requirements. Finally, at the system level, we also proposed an optimization of the ring-LWE encryption scheme that reduces the number of NTT operations. We used these computational optimizations along with several architectural optimizations to design a compact instruction-set ring-LWE public-key encryption processor on FPGA platforms. The processor requires only 20/9 μs to compute encryption/decryption on a Xilinx FPGA, which is roughly 10 times faster than present-day elliptic-curve-based public-key encryption.

Besides its application in post-quantum secure public-key cryptography, the ring-LWE problem has been used to construct homomorphic encryption schemes for privacy-preserving cloud computing. However, homomorphic encryption is very slow in software because its arithmetic involves very large polynomials with large coefficients. We designed a modular FPGA-based hardware accelerator for all the building blocks required to instantiate the somewhat homomorphic encryption scheme YASHE. We investigated efficient arithmetic to parallelize the costly polynomial arithmetic. We observed that although the computation-intensive arithmetic can be accelerated, the overhead of external memory access becomes a bottleneck. We then proposed a more practical scheme that interpolates between fully homomorphic encryption (FHE) and multiparty computation (MPC) and uses a special module to assist homomorphic function evaluation in less time. With this module we can evaluate an encrypted search in the cloud server roughly 20 times faster than an implementation without it.

In addition to the design, analysis and implementation of lattice-based cryptography, one chapter of my thesis presents a high-security elliptic curve cryptography coprocessor for resource-constrained Internet of Things (IoT) devices. Koblitz curves are a class of computationally efficient elliptic curves that offer fast point multiplications if the scalars are given as specific τ-adic expansions. This requires converting integer scalars into equivalent
τ-adic expansions. We proposed the first lightweight variant of the scalar conversion algorithm and introduced the first lightweight implementation of Koblitz curves that includes the scalar conversion. We also included countermeasures against timing attacks, SPA, DPA and safe-error fault attacks, making this the first lightweight Koblitz-curve coprocessor with such a set of protections against side-channel and fault attacks. The coprocessor consumes only 4300 gates and 9.56 μJ per point multiplication.

When this doctoral research started, little was known about the practical aspects of lattice-based cryptography. This thesis was one of the first few to shed light upon the implementation aspects. The following are the major conclusions drawn from this research.

Ring-LWE-based public-key cryptography is fast. In this thesis we investigated the implementation aspects of public-key encryption based on the ring learning with errors (ring-LWE) problem, which is presumed to be secure against quantum computers. We analyzed the arithmetic primitives, namely discrete Gaussian sampling and polynomial arithmetic. We showed that high-precision discrete Gaussian sampling can be implemented using a very small amount of resources following an adaptation of the Knuth-Yao algorithm, and our sampler architecture is also very fast. For polynomial multiplication, we used the NTT method coupled with additional optimizations in the computation steps and the architecture. As a result of our design decisions and optimization strategies, the implemented public-key encryption processor achieves very fast computation times (48/21 μs per encryption/decryption) while using minimal area and memory.

Hardware accelerates SHE, yet not enough. When we started this research, very little literature existed on implementing homomorphic encryption schemes in hardware. We designed the first hardware architecture of the building blocks required for the ring-LWE-based homomorphic encryption scheme YASHE. To accelerate the arithmetic on large polynomials with large coefficients, we proposed computational and architectural optimizations. For a proof-of-concept implementation on a Xilinx ML605 board, it turned out that the arithmetic operations can be accelerated using the FPGA, but the large data transfers between the FPGA and the external memory slow down the computation. With a more advanced memory interface and a larger FPGA, we can achieve a significant speedup with respect to software implementations. But even then, the speedup will not be able to make homomorphic encryption fast enough for deployment on cloud computers.

With some trust, FHE is close to being practical. Since hardware accelerators are not fast enough, we designed a special hardware module to assist homomorphic function evaluation. The security of this module is related to the security of a multiparty computation scheme; hence there is a certain amount of trust involved. We evaluated encrypted search as an example application, and observed that the recryption box can accelerate the encrypted search by a factor of twenty.
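As background for the polynomial arithmetic discussed above, the following is a minimal software sketch of NTT-based multiplication in the ring Z_q[x]/(x^n + 1) used by ring-LWE schemes. It is written in Python with toy parameters (n = 8, q = 17, psi = 3) chosen purely for illustration, and the function names are my own; a hardware cryptoprocessor uses an O(n log n) butterfly NTT rather than the quadratic-time transform written here for readability.

n, q = 8, 17            # toy parameters: q is prime and q = 1 (mod 2n)
psi = 3                 # primitive 2n-th root of unity mod q (psi^n = -1 mod q)
omega = psi * psi % q   # primitive n-th root of unity mod q

def transform(a, root):
    # Evaluate the polynomial a at the n powers of `root` (a length-n DFT mod q).
    return [sum(a[i] * pow(root, i * k, q) for i in range(n)) % q for k in range(n)]

def negacyclic_mul(a, b):
    # Compute a(x) * b(x) mod (x^n + 1) mod q via the negative wrapped convolution.
    a_hat = [a[i] * pow(psi, i, q) % q for i in range(n)]
    b_hat = [b[i] * pow(psi, i, q) % q for i in range(n)]
    prod = [s * t % q for s, t in zip(transform(a_hat, omega), transform(b_hat, omega))]
    c_hat = transform(prod, pow(omega, q - 2, q))          # inverse DFT, still missing the 1/n factor
    n_inv, psi_inv = pow(n, q - 2, q), pow(psi, q - 2, q)
    return [c_hat[i] * n_inv % q * pow(psi_inv, i, q) % q for i in range(n)]

a = [1, 2, 3, 4, 0, 0, 0, 0]     # 1 + 2x + 3x^2 + 4x^3
b = [5, 6, 0, 0, 0, 0, 0, 0]     # 5 + 6x
print(negacyclic_mul(a, b))      # matches the schoolbook product reduced mod (x^8 + 1) and mod 17

The pre- and post-multiplication by powers of psi, a 2n-th root of unity, is the negative wrapped convolution trick: it folds the reduction modulo x^n + 1 into the transform itself, so the transform length stays n instead of 2n.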
Major future directions after my PhD thesis:

Post-quantum cryptography for IoT: From this thesis we can conclude that ring-LWE-based public-key cryptography is as computationally intensive as the classical public-key schemes. IoT devices are constrained by the amount of available resources, such as computation capability, storage or memory, and power and energy consumption. In this research, we mainly investigated the implementation aspects of ring-LWE-based post-quantum public-key cryptography and performed implementation-specific optimizations targeting fast computation time. An interesting direction for future work would be to investigate a lightweight design methodology taking into account the limitations of IoT devices, such as small silicon area and low energy/power consumption. This would require mathematical or system-level optimizations and modifications tailored towards IoT. Ring-LWE-based cryptography relies heavily on polynomial arithmetic, so it would be interesting to design algorithms that can perform polynomial arithmetic using a very small amount of resources.

Customized FPGAs for homomorphic schemes: In this thesis we observed that the memory access overhead is the main factor restricting the speed of homomorphic evaluation using FPGAs. Hence, designing FPGAs specific to homomorphic function evaluation would be an interesting direction for future research. For example, if FPGAs were manufactured with a sufficient amount of on-chip memory to store two ciphertexts, the overhead of external memory access could be reduced significantly. Besides this, integrating a cache memory into the FPGA chip would enable prefetching of data from the slower external memory, and hence would reduce the time spent on external communication.

Protection against physical attacks: In this thesis we developed efficient algorithms and architectures for ring-LWE-based cryptographic schemes. Designing countermeasures against side-channel and fault attacks would be a very interesting direction for future research. There are a few recent works, most of which investigate physical security in the presence of a chosen-plaintext attacker (CPA). It would be very interesting to investigate physical security in the presence of a more powerful chosen-ciphertext attacker (CCA).

Effect of bias in randomness on Gaussian sampling: A discrete Gaussian sampler requires random numbers. Any bias in the randomness results in a larger statistical distance from the exact Gaussian distribution, and this could be exploited by an attacker. Hence, it would be interesting to study the effect of a biased random number generator on Gaussian sampling.
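To connect the last direction above to the sampler studied in the thesis, the following is a minimal Python sketch of Knuth-Yao sampling by column-wise traversal of the discrete distribution generating (DDG) tree. The function names, parameters and simplified table construction are my own illustrative assumptions; a real implementation handles precision, tail cut and constant-time execution far more carefully. The random bits come from a caller-supplied source, which is exactly the knob one would bias in the suggested study.

import math
import random

def gaussian_prob_table(sigma, tail_cut, precision_bits):
    # Probabilities of |x| = 0 .. tail_cut for a centred discrete Gaussian,
    # quantised to `precision_bits` fractional bits and forced to sum to 1 exactly.
    weights = [math.exp(-x * x / (2 * sigma * sigma)) for x in range(tail_cut + 1)]
    weights[0] /= 2                      # x = 0 gets half weight: a random sign is added later
    scale, total = 1 << precision_bits, sum(weights)
    probs = [round(w / total * scale) for w in weights]
    probs[0] += scale - sum(probs)       # absorb rounding error so the table is exact
    # P[row][col] = col-th fractional bit (MSB first) of the probability of sample `row`
    return [[(p >> (precision_bits - 1 - c)) & 1 for c in range(precision_bits)]
            for p in probs]

def knuth_yao_sample(P, rand_bit=lambda: random.getrandbits(1)):
    # Traverse the DDG tree one level (= one column of P) per random bit.
    rows, cols = len(P), len(P[0])
    d = 0
    for col in range(cols):
        d = 2 * d + rand_bit()                # descend one level of the DDG tree
        for row in range(rows - 1, -1, -1):   # scan the terminal nodes of this level
            d -= P[row][col]
            if d == -1:
                return row                    # hit a terminal node: `row` is the magnitude
    return 0                                  # unreachable when the table sums to exactly 1

P = gaussian_prob_table(sigma=3.33, tail_cut=39, precision_bits=64)
sign = -1 if random.getrandbits(1) else 1
print(sign * knuth_yao_sample(P))

Replacing rand_bit by a deliberately biased generator and comparing the empirical output distribution with the exact one would give a first, purely software-level feel for the effect described above.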
Curriculum Vitae
Laureates of the IBM Innovation Awards

1975 : Thierry BINGEN, ULB Joseph POELMANS, KU Leuven Baudouin LE CHARLIER, UNamur Marc VANDEN BEMPT, KU Leuven Maurice BRUYNOOGHE, KU Leuven Dirk VERMEIR, UAntwerpen Paul VAN DOOREN, KU Leuven Luc DE RIDDER, KU Leuven 1980 : Dominique LIENART, UCL Claude NYSSEN, ULg 1976 : Paul LEJEUNE, ULg Axel VAN LAMSWEERE, ULB René PENDVILLE, ULg Hubert CUYCKENS, UAntwerpen Karel DE VLAMINCK, KU Leuven Steven GILLIS, UAntwerpen Jean HUENS, KU Leuven Maurice BRUYNOOGHE, KU Leuven Patrick MERTENS, KU Leuven 1981 : Albert BRUFFAERTS, UCL 1977 : Alain KURINCKX, ULg Christian BROHET, UCL Charles LONCOUR, ULB Vera STOEFS, VUB Johan DECONINCK, VUB Elise DE DONCKER, KU Leuven Marc VAN OVERMEIRE, VUB Hendrik OLIVIE, UAntwerpen Pierre VERBAETEN, KU Leuven 1982 : Eric DUBOIS, UNamur 1978 : Colette GOSSART, UCL Hendrik VANTILBORGH, UGent Françoise CARETTE, ULB Paul DE BRA, UAntwerpen Jean-André ESSERS, ULg Marc GYSSENS, UAntwerpen Hans DE MEYER, UGent Rik VERSTRAETE, KU Leuven Franciscus DE SCHUTTER, UGent P. VAN DER CRUYSSEN, UAntwerpen 1979 : Bernard ROZENCWEIG, ULB 1983 : Jean-Michel VAN VYVE, UCL Joseph BREMER, ULg Jacques HAGELSTEIN, ULg Claude FLEURY, ULg Dirk VAN GUCHT, VUB
Ivo VAN HOREBEEK, KU Leuven Johan OPSOMMER, UGent Paul SUETENS, KU Leuven Philip RADEMAKERS, VUB Ferdinand PUT, KU Leuven 1984 : Vincent BODART, UNamur Guy VAN HOOVELD, ULB 1990 : Patrice GODEFROID, ULg Michel HAUTFENNE, ULB Anne ROUSSEAU, ULB Viviane JONCKERS, VUB Eric GREGOIRE, UCL Patricia MAES, VUB Rafael VAN DRIESSCHE, KU Leuven Dirk JANSSENS, UAntwerpen Frank PEETERS, UAntwerpen 1985 : Yves LEDRU, UMONS 1991 : Yves BAGUETTE, ULg Daniel JULIN, ULB Arnold GINETTI, UCL Paul-Henri HEENEN, ULB Guy LEDUC, ULg Catherine ERKELENS, VUB Ann SINAP, KU Leuven Yvo DESMEDT, KU Leuven Jan JANSSENS, VUB 1986 : Pierre HENQUET, ULg 1992 : Luc LEONARD, ULg Jean-Louis BINOT, ULg Marie-Jeanne TOUSSAINT, ULg Eddy AERTS, KU Leuven Wim MEES, RMA Koen JANSSENS, UAntwerpen Frank PIESSENS, KU Leuven Marc BARTHOLOMEUS, KU Leuven Bernard MANDERICK, VUB Marc GYSSENS, UAntwerpen Dirk VANDERMEULEN, KU Leuven 1987 : Louis WEHENKEL, ULg 1993 : Jean-Claude HEMMER, ULg Yves DEVILLE, UNamur Jean-Marie BECKERS, ULg Eddy DEBAERE, UGent Pierre SEMAL, UCL Paul DE BRA, UAntwerpen Kurt LUST, KU Leuven Peter JOHANNES, KU Leuven 1988 : François PICHAULT, ULg Koenraad DE BOSSCHERE, UGent 1994 : Vincent KIEFFER, ULg Walter VAN DE VELDE, VUB Benoît CHAMPAGNE, UNamur Serge GUTWIRTH, VUB 1989 : Luc MOREAU, ULg J. VAN DEN BUSSCHE, UAntwerpen Christian MELOT, ULB
1995 : Thang NGUYEN, UCL Gregory NEVEN, KU Leuven Pierre COLLETTE, UCL Sofia VERBAETEN, KU Leuven Luc MOREAU, ULg Geert UYTTERHOEVEN, KU Leuven 2002 : Nicolas BONMARIAGE, ULg Bernhard MARTENS, KU Leuven Sébastien JODOGNE, ULg Matthieu FERRANT, UCL 1996 : Xavier BOYEN, ULg Bernard WILLEMS, ULg 2003 : Bart ADAMS, KU Leuven Denis VANDERSTRAETEN, UCL Lieven EECKHOUT, UGent Maarten JANSEN, KU Leuven Gunther SABLON, KU Leuven 2004 : Eytan LEVY, ULB Virginie LOUSSE, UNamur 1997 : Renaud PAQUAY, UNamur Wim VANHOOF, KU Leuven 2005 : Steve UHLIG, UCL Rudi VERBEECK, KU Leuven Hans VANDIERENDONCK, UGent 1998 : Olivier BARETTE, UCL 2006 : Jean-Charles DELVENNE, UCL Luc LEONARD, ULg Davy VAN NIEUWENBORGH, VUB Jean VANDERDONCKT, UNamur Geert VAN DER AUWERA, VUB 2007 : Hadrien MELOT, UMONS Johan VAN PRAET, KU Leuven Wim MARTENS, UHasselt 1999 : Pierre GEURTS, ULg 2008 : Gilles GEERAERTS, ULB Bernard BOIGELOT, ULg Axel LEGAY, ULg Lieven EECKHOUT, UGent Ares LAGAE, KU Leuven Bart KUIJPERS, KU Leuven 2009 : Raphaël JUNGERS, UCL 2000 : Laurent VAN BEGIN, ULB Steven SCHOCKAERT, UGent Frédéric NOO, ULg Toon CALDERS, UAntwerpen 2010 : Anthony CLEVE, UNamur Frank NEVEN, UAntwerpen Niels LANDWEHR, KU Leuven 2001 : Christophe LAURENT, ULB 2011 : Thibault HELLEPUTTE, UCL Wim MEES, RMA Bart GOOSSENS, UGent
2012 : Thomas DRUGMAN, UMONS; Alexander BERTRAND, KU Leuven
2013 : Vân Anh HUYNH-THU, ULg; Nikolaos DELIGIANNIS, VUB
2014 : Julie DE PRIL, UMONS; Benoît FRENAY, UCL; Guy VAN den BROECK, KU Leuven
2015 : Thomas PETERS, UCL; Jo VERMEULEN, UHasselt
2016 : Sandrine BROGNAUX, UCL / UMONS; Stijn VOLCKAERT, UGent
2017 : Vasiliki KALAVRI, KTH, Sweden / UCL; Raf RAMAKERS, UHasselt
2018 : Adrien TAYLOR, UCL; Sujoy SINHA ROY, KU Leuven