During the first decade of this new millennium, it is estimated that more than EUR 100 billion will be invested in the third generation (3G) Universal Mobile Telecommunications System (UMTS) in Europe. This fact represents an amazing challenge from both a technical and a commercial perspective. Written by experts in the field, this book gives a detailed description of the elements in the UMTS network architecture: the User Equipment (UE), the UMTS Radio Access Network (UTRAN) and the core network. The completely new protocols based on the needs of the new Wideband Code Division Multiple Access (WCDMA) air interface are highlighted by considering both Frequency- and Time-Division Duplex modes. The book also introduces the key features of Releases 5, 6 and 7.
This helpful resource covers a wide range of information regarding electrical actuators. In particular, robustness, a very problematic issue, is fully explored in a dedicated chapter. The text also deals with the estimation of non-measurable mechanical variables, examining the estimation of load torque and then the observation of position in drives without a mechanical sensor. Finally, it examines the conditions needed to measure variables and the real implementation of numerical algorithms. This is a key working resource for electrical engineers.
This comprehensive scientific work embraces, within the generic theme of "educations, territorialities and territories", the vast majority of the different facets of the complex relationships between educations and territories that have developed over time. It sheds an original light on the many - and, for some, new - interactions between territorialities and territories, on the one hand, and educations, on the other, which have recently been identified and analyzed. Beyond this main objective, it contributes to a fine and differentiated understanding of the concept of territory in the sciences of education and training and, more importantly, it brings innovative developments to the still embryonic theorization of the complex relations between educations and territories-territorialities. This book shows, in particular, through its surveys, analyses and results, that among the multiple influences attributed to the different dimensions of territories, the very discreet territoriality - falling within the symbolic territory - is perhaps the most important territorial vector in terms of education in certain areas (mountainous rural areas, for example), particularly as regards educational and vocational guidance, but not only there. Lastly, it is worth noting that the theme it addresses is spreading today beyond scientific circles: the problem of territorial inequalities in education and guidance has recently been fueling the controversies and reflections of French educational policy, which thus sometimes echoes - essentially in declarative terms for the moment - scientific advances in this area.
Networks are now embedded in daily life thanks to smaller, faster, inexpensive components that are more powerful and increasingly connected. Parallel to this quantitative explosion of communication networks, the technology has become more complex. This development brings challenges of management and control, and it has become necessary to manage the service levels that the provider commits to deliver to the client. Different approaches to managing one or more service level components in emerging environments are explored, such as the Internet of Things, the Cloud, smart grids, e-health, mesh networking, D2D (Device to Device), smart cities and even green networking. This book therefore allows for a better understanding of the important challenges and issues relating to Quality of Service (QoS) management, security and mobility in these types of environment.
The book deals with requirements engineering in the context of systems engineering. It proposes a method to guide this engineering activity. The method is supported by the SysML modeling language. The first chapter presents the context and the associated definitions, positions requirements engineering within systems engineering processes, defines modeling and its contributions, and makes the link with the management of systems engineering projects. The second chapter is devoted to the proposed method for implementing the requirements engineering subprocesses. Each of the eight activities that make it up is first described before specifying how the SysML language can be exploited to carry it out effectively. The third chapter is an application of the method to define the needs of the stakeholders of a system. The example is built on the basis of the RobAFIS'2018 competition. The fourth chapter continues the application of the method, in the continuity of the systems engineering processes, to define the requirements of the same system. The appendices present both a toolbox for carrying out requirements engineering and the complete engineering results from Chapters 3 and 4.
This book studies the way luxury fashion develops representational politics by reinvesting symbolic fields such as art and culture, religion and the sacred, as well as politics, in other words fields that represent a certain common pattern of life and a common interest. I develop a semiotic approach to the way art exhibitions, print and audiovisual advertising, publishing and distribution politics, as well as special ready-to-wear collaborations with artists such as Jeff Koons, reveal the fashion industry's gesture of pretending to be a non-commercial structure, especially in order to cover up its industrialization and banalization process.
The humanities and social sciences have been interested in cybersecurity as an object of study since its emergence in security debates at the beginning of the 2000s. This scientific production is thus still relatively young, but diversified, mobilizing political science, international relations, sociology, law, information science, security studies, surveillance studies, strategic studies and polemology. There is, however, no actual field of cybersecurity studies. After two decades of scientific production on this subject, we thought it essential to take stock of the research methods that could be mobilized, imagined and invented by researchers. The research methodology on the subject of cybersecurity has, paradoxically, been the subject of relatively few publications to date. This dimension is essential. It is the initial phase through which any researcher, seasoned or young doctoral student, must pass to define a subject of study, delimit its contours, ask the research questions and choose the methods of treatment. It is this methodological dimension that our book proposes to treat. The questions the authors were asked to answer were: how can cybersecurity be defined? Which disciplines in the humanities and social sciences are studying cybersecurity, and how? What is the place of pluralism or interdisciplinarity? How are the research topics chosen and the questions defined? How, concretely, can cybersecurity be studied: tools, methods, theories, organization of research, research fields, data? How are discipline-specific theories useful for understanding and studying cybersecurity? Has cybersecurity had an impact on scientific theories?
In order to enable general understanding and to foster the implementation of necessary support measures in organizations, this book describes the fundamental and conceptual aspects of cyberspace abuse. These aspects are logically and reasonably discussed in the fields related to cybercrime and cyberwarfare. The book illustrates differences between the two fields, perpetrators' activities, as well as the methods of investigating and fighting against attacks committed by perpetrators operating in cyberspace.
The first chapter focuses on the understanding of cybercrime, i.e. the perpetrators, their motives and their organizations. Tools for implementing attacks are also briefly mentioned; however, this book is not technical and does not intend to instruct readers about the technical aspects of cybercrime, focusing instead on managerial views of cybercrime. Other sections of this chapter deal with protection against attacks, fear, investigation and the cost of cybercrime. Relevant legislation and the legal bodies involved in dealing with cybercrime are briefly described at the end of the chapter.
The second chapter deals with cyberwarfare and explains the difference between classic cybercrime and operations taking place in the modern inter-connected world. It tackles the following questions: who is committing cyberwarfare; who are the victims and who are the perpetrators? Countries which have an important role in cyberwarfare around the world, and the significant efforts being made to combat cyberwarfare on national and international levels, are mentioned.
The common points of cybercrime and cyberwarfare, the methods used to protect against them and the vision of the future of cybercrime and cyberwarfare are briefly described at the end of the book.
Contents 1. Cybercrime.
2. Cyberwarfare. About the Authors Igor Bernik is Vice Dean for Academic Affairs and Head of the Information Security Lab at the University of Maribor, Slovenia. He has written and contributed towards over 150 scientific articles and conference papers, and co-authored 4 books. His current research interests concern information/cybersecurity, cybercrime, cyberwarfare and cyberterrorism.
More and more organizations are becoming aware of the importance of the tacit and explicit knowledge owned by their members, which corresponds to their experience and accumulated knowledge about the firm's activities. However, considering the large amount of knowledge created and used in the organization, especially with the evolution of information and communications technologies, the firm must first determine the specific knowledge on which it is necessary to focus. Creating activities to enhance the identification, preservation and use of this knowledge is a powerful means of improving the economic performance of the organization. Thus, companies invest in knowledge management programs in order to develop a knowledge sharing and collaboration culture, to amplify individual and organizational learning, to ease access to and transfer of knowledge, and to ensure knowledge preservation. Several research approaches can be considered to develop knowledge management programs supported by information and knowledge systems, according to their context, their culture and the stakeholders' viewpoints.
In risk studies, engineers often have to consider the consequences of an accident leading to a shock on a construction. This can concern the impact of a ground vehicle or aircraft, or the effects of an explosion on an industrial site. This book presents a didactic approach starting with the theoretical elements of the mechanics of materials and structures, in order to develop their applications in the cases of shocks and impacts. The latter are studied on a local scale at first. They lead to stresses and strains in the form of waves propagating through the material, this movement then extending to the whole of the structure. The first part of the book is devoted to the study of solid dynamics where nonlinear behaviors come into play. The second part covers structural dynamics and the evaluation of the transient response introduced at the global scale of a construction. Practical methods, simplified methods and methods that are in current use by engineers are also proposed throughout the book. The aim of this book is to present theoretical elements regarding solids and structures, as well as modeling tools, in order to study the vulnerability of a structure to a short duration action, generally of an accidental nature. The book takes the point of view of an engineer seeking to model the physics at stake in order to carry out his or her study relevantly. The book's originality is that it gathers elements from various fields of engineering science for a practical purpose.
A bistatic radar is a radar system whose transmitter and receiver are separated by a distance comparable to the expected target distance. This book provides a general theoretical description of such bistatic technology in the context of synthetic aperture, inverse synthetic aperture and forward scattering radars from the point of view of analytical geometrical and signal formation and processing theory. Signal formation and image reconstruction algorithms are developed with the application of highly informative linear frequency and phase code modulating techniques, and numerical experiments that confirm the theoretical models are carried out. The authors suggest program implementations of the developed algorithms. A theoretical summary of the latest results in the field of bistatic radars is provided, before an analytical geometrical description is applied to scenarios of bistatic synthetic aperture, inverse synthetic aperture and forward scattering radars with cooperative and non-cooperative transmitters. Signal models with linear frequency and phase code modulation are developed, and the special phase modulations with C/A (coarse acquisition) and P (precision) codes of GPS satellite transmitters are considered. The authors suggest Matlab implementations of all geometrical models and signal formation and processing algorithms. Contents 1. Bistatic Synthetic Aperture Radar (BSAR) Survey.
2. BSAR Geometry.
3. BSAR Waveforms and Signal Models.
4. BSAR Image Reconstruction Algorithms.
5. Analytical Geometrical Determination of BSAR Resolution.
6. BSAR Experimental Results.
7. BSAR Matlab Implementation.
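As a minimal sketch of the linear frequency modulation (chirp) waveforms this description refers to, a real-valued chirp pulse can be generated as follows; the function name and parameter values are illustrative only and are not taken from the book.

```python
import math

def lfm_chirp(f0, bandwidth, duration, n_samples):
    # Linear frequency modulated (chirp) pulse: the instantaneous
    # frequency sweeps linearly from f0 to f0 + bandwidth over the
    # pulse duration, giving phase 2*pi*(f0*t + k*t^2/2).
    k = bandwidth / duration          # chirp rate in Hz/s
    dt = duration / n_samples
    samples = []
    for i in range(n_samples):
        t = i * dt
        phase = 2.0 * math.pi * (f0 * t + 0.5 * k * t * t)
        samples.append(math.cos(phase))
    return samples

# e.g. a 1 MHz-bandwidth, 100 us pulse sampled 1000 times:
pulse = lfm_chirp(0.0, 1.0e6, 1.0e-4, 1000)
```

In practice the book develops such waveforms (and phase-coded ones) analytically and in Matlab; this fragment only shows the underlying signal model.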
Most books dedicated to the issues of bio-sensing are organized by the well-known scheme of a biosensor. In this book, the authors have deliberately decided to break away from the conventional way of treating biosensing research by uniquely addressing biomolecule immobilization methods on a solid surface, fluidics issues and biosensing-related transduction techniques, rather than focusing simply on the biosensor. The aim is to provide a contemporary snapshot of the biosensing landscape without neglecting the seminal references or products where needed, following the downscaling (from the micro- to the nanoscale) of biosensors and their respective best known applications. To conclude, a brief overview of the most popularized nanodevices applied to biology is given, before comparing biosensor criteria in terms of targeted applications.
Crowdsourcing is a relatively recent phenomenon that only appeared in 2006, but it continues to grow and diversify (crowdfunding, crowdcontrol, etc.). This book aims to review this concept and show how it leads to the creation of value and new business opportunities.
Chapter 1 is based on four examples: the online banking sector, an informative television channel, the postal sector and the higher education sector. It shows that in the current context, for a company facing challenges, the crowd remains an untapped resource. The next chapter presents crowdsourcing as a new form of externalization and offers definitions of crowdsourcing. In Chapter 3, the authors attempt to explain how a company can create value by means of a crowdsourcing operation. To do this, the authors use a model linking types of value, types of crowd, and the means by which these crowds are accessed.
Chapter 4 examines in detail various forms that crowdsourcing may take, by presenting and discussing ten types of crowdsourcing operation. In Chapter 5, the authors imagine and explore the ways in which the dark side of crowdsourcing might be manifested and Chapter 6 offers some insight into the future of crowdsourcing. Contents 1. A Turbulent and Paradoxical Environment.
2. Crowdsourcing: A New Form of Externalization.
3. Crowdsourcing and Value Creation.
4. Forms of Crowdsourcing.
5. The Dangers of Crowdsourcing.
6. The Future of Crowdsourcing. About the Authors Jean-Fabrice Lebraty is Professor of management sciences at IAE (Business School) at Jean Moulin - Lyon 3 University in France and a member of the research laboratory Magellan EA3713. He specializes in the management of information and communication systems and his research notably concerns decision-making and the links between crowd and information technology.
Katia Lobre-Lebraty is Associate Professor of management sciences at IAE (Business School) at Jean Moulin - Lyon 3 University in France and a member of the research laboratory Magellan EA3713. She specializes in management control and strategic management and her research concerns both the modes of governance of organizations and Open Data.
Since Lord Rayleigh introduced the idea of viscous damping in his classic work "The Theory of Sound" in 1877, it has become standard practice to use this approach in dynamics, covering a wide range of applications from aerospace to civil engineering. However, in the majority of practical cases this approach is adopted more for mathematical convenience than for modeling the physics of vibration damping.
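The Rayleigh (proportional) viscous damping model referred to here writes the damping matrix as a linear combination of the mass and stiffness matrices, which makes each mode's damping ratio a simple function of its natural frequency:

```latex
C = \alpha M + \beta K,
\qquad
\zeta_i = \frac{1}{2}\left(\frac{\alpha}{\omega_i} + \beta\,\omega_i\right)
```

Here $\alpha$ and $\beta$ are the two fitted coefficients and $\omega_i$ the $i$-th undamped natural frequency; it is precisely the restrictiveness of this two-parameter form that motivates the general non-viscous models studied in the book.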
Over the past decade, extensive research has been undertaken on more general "non-viscous" damping models and the vibration of non-viscously damped systems. This book, along with a related book Structural Dynamic Analysis with Generalized Damping Models: Analysis, is the first comprehensive study to cover vibration problems with general non-viscous damping. The author draws on his considerable research experience to produce a text covering: parametric sensitivity of damped systems; identification of viscous damping; identification of non-viscous damping; and some tools for the quantification of damping. The book is written from a vibration theory standpoint, with numerous worked examples which are relevant across a wide range of mechanical, aerospace and structural engineering applications.
Contents 1. Parametric Sensitivity of Damped Systems.
2. Identification of Viscous Damping.
3. Identification of Non-viscous Damping.
4. Quantification of Damping. About the Authors Sondipon Adhikari is Chair Professor of Aerospace Engineering at Swansea University, Wales. His wide-ranging and multi-disciplinary research interests include uncertainty quantification in computational mechanics, bio- and nanomechanics, dynamics of complex systems, inverse problems for linear and nonlinear dynamics, and renewable energy. He is a technical reviewer of 97 international journals, 18 conferences and 13 funding bodies. He has written over 180 refereed journal papers, 120 refereed conference papers and has authored or co-authored 15 book chapters.
The technique the authors describe can visualize the atomic structure of molecules; in terms of image processing, this requires considering the reconstruction of sparse images. Many works have leveraged the assumption of sparsity in order to achieve an improved performance that would not otherwise be possible. For nano MRI, the assumption of sparsity holds by default since, at the atomic scale, molecules are sparse structures. This work reviews the latest results on molecular imaging for nano MRI. Sparse image reconstruction methods can be categorized as either non-Bayesian or Bayesian. A comparison of the performance and complexity of several such algorithms is given.
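One standard non-Bayesian sparse reconstruction method of the kind such comparisons typically include is iterative shrinkage-thresholding (ISTA), which alternates a gradient step on the data-fit term with an elementwise soft-threshold. The sketch below is a generic illustration under that assumption, not the authors' algorithm; the matrix and parameters are toy values.

```python
def soft_threshold(v, t):
    # Proximal operator of t*||.||_1, applied elementwise:
    # shrink each entry toward zero by t, clipping at zero.
    return [max(abs(x) - t, 0.0) * (1.0 if x > 0 else -1.0) for x in v]

def ista(A, b, lam, step, iters=100):
    # Minimize (1/2)||Ax - b||^2 + lam*||x||_1 by gradient descent
    # on the quadratic term followed by soft-thresholding.
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = soft_threshold([x[j] - step * g[j] for j in range(n)], step * lam)
    return x
```

With an identity measurement matrix the solution reduces to soft-thresholding the data, which makes the behavior easy to check by hand.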
The rapid developments in magnetic resonance imaging (MRI) over the past 20 years have affirmed its supremacy over most other means of non-invasive exploration of the human body. This progress has had other consequences for imaging physicists: having knowledge about only one of the sides of MRI is nowadays no longer enough to develop new sequences or even to learn more about those that already exist. It is necessary to have a clear and precise view of all the fields explored today by this imaging technique, such as rapid imaging, flows, diffusion, perfusion or even functional MRI.
This book aims to allow readers with a grounding in the physics and mathematics of MRI to easily immerse themselves in techniques that are not familiar to them. Pragmatic in approach, moving between the physics underlying the techniques being studied and the clinical examination of images, it will also be of interest to radiologists looking to define protocols or make better use of the images obtained.
Contents 1. Flow.
4. Functional MRI. About the Authors Vincent Perrin is a specialist teacher in the fields of physics and chemistry.
This title focuses on two significant problems in the field of automatic control, in particular state estimation and robust Model Predictive Control under input and state constraints, bounded disturbances and measurement noises. The authors build upon previous results concerning zonotopic set-membership state estimation and output feedback tube-based Model Predictive Control. Various existing zonotopic set-membership estimation methods are investigated and their advantages and drawbacks are discussed, making this book suitable both for researchers working in automatic control and industrial partners interested in applying the proposed techniques to real systems.
The authors proceed to focus on a new method based on the minimization of the P-radius of a zonotope, in order to obtain a good trade-off between the complexity and the accuracy of the estimation. They propose a P-radius based set-membership estimation method to compute a zonotope containing the real states of a system, which are consistent with the disturbances and measurement noise. The problem of output feedback control using a zonotopic set-membership estimation is also explored. Among the approaches from existing literature on the subject, the implementation of robust predictive techniques based on tubes of trajectories is developed.
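The central object in this estimation framework, the zonotope, is the set Z = {c + Hs : ||s||_inf <= 1} for a center c and generator matrix H. A minimal sketch of one elementary operation on it, taking the interval hull that bounds the guaranteed state enclosure, is given below; this is a generic textbook construction, not the authors' P-radius algorithm.

```python
def zonotope_interval_hull(c, generators):
    # Zonotope Z = {c + H s : ||s||_inf <= 1}, with H given as a list
    # of generator vectors. Along each axis i, the zonotope extends
    # c[i] +/- sum over generators of |g[i]|, which yields its
    # tightest axis-aligned bounding box.
    n = len(c)
    radius = [sum(abs(g[i]) for g in generators) for i in range(n)]
    return [(c[i] - radius[i], c[i] + radius[i]) for i in range(n)]

# A 2D zonotope centered at (1, 0) with two generators:
box = zonotope_interval_hull([1.0, 0.0], [[1.0, 0.0], [0.5, 0.5]])
```

The appeal of zonotopes for set-membership estimation is exactly this kind of cheap, closed-form geometry, while the P-radius criterion developed in the book controls how the enclosure's size evolves over time.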
Contents 1. Uncertainty Representation Based on Set Theory.
2. Several Approaches on Zonotopic Guaranteed Set-Membership Estimation.
3. Zonotopic Guaranteed State Estimation Based on P-Radius Minimization.
4. Tube Model Predictive Control Based on Zonotopic Set-Membership Estimation. About the Authors Vu Tuan Hieu Le is a Research Engineer at the IRSEEM/ESIGELEC Technopôle du Madrillet, Saint Etienne du Rouvray, France.
Cristina Stoica is Assistant Professor in the Automatic Control Department at SUPELEC Systems Sciences (E3S), France.
Teodoro Alamo is Professor in the Department of Systems Engineering and Automatic Control at the University of Seville, Spain.
Eduardo F. Camacho is Professor in the Department of Systems Engineering and Automatic Control at the University of Seville, Spain.
Didier Dumur is Professor in the Automatic Control Department, SUPELEC Systems Sciences (E3S), France.
This book targets new trends in microwave engineering by downscaling components and devices for industrial purposes such as miniaturization and function densification, in association with the new approach of activation by a confined optical remote control. It covers the fundamental groundwork of the structure, properties, characterization methods and applications of 1D and 2D nanostructures, along with providing the necessary knowledge on atomic structure, how it relates to the material band structure and how this in turn leads to the amazing properties of these structures. It thus provides new graduates, PhD students and postdoctoral researchers with a resource equipping them with the knowledge to undertake their research.
The classic approach in Automatic Control relies on the use of simplified models of the systems and reformulations of the specifications. In this framework, the control law can be computed using deterministic algorithms. However, this approach fails when the system is too complex for its model to be sufficiently simplified, when the designer has many constraints to take into account, or when the goal is not only to design a control but also to optimize it. This book presents a new trend in Automatic Control with the use of metaheuristic algorithms. These kinds of algorithm can optimize any criterion and constraint, and therefore do not need such simplifications and reformulations.
The first chapter outlines the author's main motivations for the approach which he proposes, and presents the advantages which it offers. In Chapter 2, he deals with the problem of system identification. The third and fourth chapters are the core of the book where the design and optimization of control law, using the metaheuristic method (particle swarm optimization), is given. The proposed approach is presented along with real-life experiments, proving the efficiency of the methodology. Finally, in Chapter 5, the author proposes solving the problem of predictive control of hybrid systems. Contents 1. Introduction and Motivations.
2. Symbolic Regression.
3. PID Design Using Particle Swarm Optimization.
4. Tuning and Optimization of H-infinity Control Laws.
5. Predictive Control of Hybrid Systems. About the Authors Guillaume Sandou is Professor in the Automatic Department of Supélec, in Gif Sur Yvette, France. He has had 12 books, 8 journal papers and 1 patent published, and has written papers for 32 international conferences. His main research interests include modeling, optimization and control of industrial systems; optimization and metaheuristics for Automatic Control; and constrained control.
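The particle swarm optimization used in Chapters 3 and 4 can be sketched in a few lines: each particle remembers its own best position, the swarm shares a global best, and velocities blend inertia with pulls toward both. This is a generic PSO sketch with illustrative parameter values, not the author's implementation, and the test function stands in for a real control criterion.

```python
import random

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    # Minimal particle swarm minimization of f over box bounds.
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the control setting described above, f would evaluate a closed-loop criterion for a candidate set of controller gains; here any cost function over a box works.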
This book presents the basics of the non-invasive geophysical method for groundwater investigation, called Magnetic Resonance Sounding (MRS) or Surface Nuclear Magnetic Resonance (SNMR), and its practical application to the problems of groundwater localization and aquifer characterization. The method is based on the nuclear magnetic resonance (NMR) phenomenon and is selectively sensitive to groundwater. The main aims of the author are to teach the reader the basic principles of the method as well as to formulate consistent approximate models, leading to reasonably simple inverse problems. Containing an extensive bibliography, numerous practical and numerical examples as well as a detailed presentation of the nuts and bolts of the method based on the long-term experience of SNMR development and practical use, this book is useful for students, scientists and professional engineers working in the field of hydrogeophysics and hydrogeology. Contents 1. SNMR Imaging for Groundwater.
2. The Basics of NMR.
3. Forward Modeling.
5. Link Between SNMR and Aquifer Parameters.
This book shows how dispersion engineering in two dimensional dielectric photonic crystals can provide new effects for the precise control of light propagation for integrated nanophotonics.
Dispersion engineering in regular and graded photonic crystals to promote anomalous refraction effects is studied, from concepts to experimental demonstration, via nanofabrication considerations. Self-collimation, ultra-refraction and negative refraction, second harmonic generation, mirage and invisibility effects, which lead to an unprecedented control of light propagation at the (sub-)wavelength scale for the field of integrated nanophotonics, are detailed and commented upon.
This book includes a study of trustworthiness, percentile response time, service availability, and authentication in the networks between users and cloud service providers, and at service stations or sites that may be owned by different service providers. The first part of the book contains an analysis of percentile response time, which is one of the most important SLA (service level agreement) metrics. Effective and accurate numerical solutions for the calculation of the percentile response time in single-class and multi-class queueing networks are obtained. Then, the numerical solution is incorporated in a resource allocation problem. Specifically, the authors present an approach to resource optimization that minimizes the total cost of the computer resources required while preserving a given percentile of the response time. In the second part, the approach is extended to consider trustworthiness, service availability, and the percentile of response time in Web services. These QoS metrics are clearly defined and their quantitative analysis provided. The authors then take these QoS metrics into account in a trust-based resource allocation problem in which a set of cloud computing resources is used by a service provider to host a typical Web services application, for single-class and multiple-class customer services respectively. Finally, in the third part of the book, a thorough performance evaluation of two notable public key cryptography-based authentication techniques, Public-Key Cross Realm Authentication in Kerberos (PKCROSS) and Public Key Utilizing Tickets for Application Servers (PKTAPP, a.k.a. KX.509/KCA), is given in terms of computational and communication times. The authors then demonstrate their performance difference using queueing networks. PKTAPP has been proposed to address the scalability issue of PKCROSS.
However, their in-depth analysis of these two techniques shows that PKTAPP does not perform better than PKCROSS in a large-scale system. Thus, they propose a new public key cryptography-based group authentication technique. The performance analysis demonstrates that the new technique scales better than PKCROSS and PKTAPP.
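The book's percentile response time analysis targets general single- and multi-class queueing networks; the simplest special case, which illustrates what the metric means, is the M/M/1 queue, where the response (sojourn) time is exponentially distributed with rate mu - lambda. This closed form is a standard queueing-theory illustration, not the authors' numerical method.

```python
import math

def mm1_response_percentile(arrival_rate, service_rate, p):
    # In a stable M/M/1 queue, the response time T is exponential with
    # rate (service_rate - arrival_rate), so its p-th percentile is
    # t_p = -ln(1 - p) / (service_rate - arrival_rate).
    assert 0 < arrival_rate < service_rate and 0.0 < p < 1.0
    return -math.log(1.0 - p) / (service_rate - arrival_rate)

# e.g. 8 jobs/s arriving at a 10 jobs/s server: mean response is 0.5 s,
# but the 95th percentile is about 1.5 s.
t95 = mm1_response_percentile(8.0, 10.0, 0.95)
```

The gap between the mean and a high percentile is exactly why SLAs are stated in percentiles, and why the resource allocation problem above constrains the percentile rather than the average.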
During the reception of a piece of information, we are never passive. Depending on its origin and content, and on our personal beliefs and convictions, we bestow upon this piece of information, spontaneously or after reflection, a certain amount of confidence. Too much confidence shows a degree of naivety, whereas an absolute lack of it condemns us as being paranoid. These two attitudes are symmetrically detrimental, not only to the proper perception of this information but also to its use. Beyond these two extremes, each person generally adopts an intermediate position when faced with the reception of information, depending on its provenance and credibility. We still need to understand and explain how these judgements are conceived, in what context and to what end.
Spanning the approaches offered by philosophy, military intelligence, algorithmics and information science, this book presents the concepts of information and the confidence placed in it, the methods that militaries, the first to be aware of the need, have or should have adopted, tools to help them, and the prospects that they have opened up. Beyond the military context, the book reveals ways to evaluate information for the good of other fields such as economic intelligence, and, more globally, the informational monitoring by governments and businesses.
Contents 1. Information: Philosophical Analysis and Strategic Applications, Mouhamadou El Hady Ba and Philippe Capet.
2. Epistemic Trust, Gloria Origgi.
3. The Fundamentals of Intelligence, Philippe Lemercier.
4. Information Evaluation in the Military Domain: Doctrines, Practices and Shortcomings, Philippe Capet and Adrien Revault d'Allonnes.
5. Multidimensional Approach to Reliability Evaluation of Information Sources, Frédéric Pichon, Christophe Labreuche, Bertrand Duqueroie and Thomas Delavallade.
6. Uncertainty of an Event and its Markers in Natural Language Processing,
Mouhamadou El Hady Ba, Stéphanie Brizard, Tanneguy Dulong and Bénédicte Goujon.
7. Quantitative Information Evaluation: Modeling and Experimental Evaluation,
Marie-Jeanne Lesot, Frédéric Pichon and Thomas Delavallade.
8. When Reported Information Is Second Hand, Laurence Cholvy.
9. An Architecture for the Evolution of Trust: Definition and Impact of the Necessary Dimensions of Opinion Making, Adrien Revault d'Allonnes. About the Authors Philippe Capet is a project manager and research engineer at Ektimo, working mainly on information management and control in military contexts.
Thomas Delavallade is an advanced studies engineer at Thales Communications & Security, working on social media mining in the context of crisis management, cybersecurity and the fight against cybercrime.
This book presents digital encoders for data communications. After an introduction to data communications and the different sequences involved, the authors present the Frey encoder as a digital filter, followed by trellis-coded and parallel turbo trellis-coded modulation schemes using nonlinear digital encoders.
The book contains many numerical examples that complete the description of the analyzed schemes, and some performance simulation results are provided. The appendices include demonstrations of the mathematical apparatus used throughout the book and some Matlab/Simulink source files used to run the simulations. Students can therefore easily understand the concepts presented in the book and simulate the schemes.