The open multidisciplinary archive HAL is intended for the deposit and dissemination of research-level scientific articles, whether published or not, and of theses, originating from French or foreign teaching and research institutions, and from public or private laboratories.

Latest Deposits
Chemistry

Economics and Quantitative Finance

Computer Science

Mathematics

Physics

Planet and Universe

Nonlinear Science

Cognitive Sciences

Environmental Sciences

Humanities and Social Sciences

Engineering Sciences

Life Sciences

Statistics

Two methods for immobilizing the 4-oxocyclopentadienylmanganese tricarbonyl butyric acid I on a gold surface are presented: either through the assembly of a disulfide derived from the acid, or by coupling on a cystamine monolayer. The efficiency of these two routes is evaluated by IR reflection spectroscopy and X-ray photoelectron spectroscopy.


Langmuir-Blodgett monolayers are formed by deposition of semi-amphiphilic porphyrin-phthalocyanine heterodimers. The optical and photophysical properties of these dimers have been investigated and compared to liquid-phase data. Excitation of the dimer results in the instantaneous formation of the singlet excited states, followed by a very efficient charge-transfer reaction. The oxidized porphyrin and reduced phthalocyanine moieties are formed within 2 ps and disappear within 70 ps. The triplet excited states of the porphyrin, produced via the intersystem-crossing decay pathway of the singlet excited states, are formed with a very low quantum yield. They also undergo a charge-transfer reaction, leading to the formation of long-lived transient charge carriers. The photoprocesses determined in the Langmuir-Blodgett films of semi-amphiphilic porphyrin-phthalocyanine heterodimers are almost identical to those previously observed for the same dimers in the liquid phase.

This survey reviews the portfolio selection problem over a long-term horizon. We consider two objectives: (i) maximizing the probability of outperforming a target growth rate of the wealth process, and (ii) minimizing the probability of falling below a target growth rate. We study the asymptotic behavior of these criteria, formulated as large deviations control problems, which we solve by a duality method leading to ergodic risk-sensitive portfolio optimization problems. Special emphasis is placed on linear factor models, where explicit solutions are obtained.
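The two criteria can be sketched as large-deviations asymptotics. The notation below (wealth $X_T^{\pi}$ under strategy $\pi$, target growth rate $c$) is illustrative and not taken verbatim from the survey:

```latex
% (i) outperformance probability, (ii) shortfall probability
\text{(i)}\quad \sup_{\pi}\;\limsup_{T\to\infty}\;\frac{1}{T}\,
  \ln \mathbb{P}\!\left[\frac{1}{T}\ln X_T^{\pi} \ge c\right],
\qquad
\text{(ii)}\quad \inf_{\pi}\;\liminf_{T\to\infty}\;\frac{1}{T}\,
  \ln \mathbb{P}\!\left[\frac{1}{T}\ln X_T^{\pi} \le c\right].
```

The duality mentioned in the abstract trades these rare-event rates for ergodic risk-sensitive control problems, which are typically easier to solve explicitly in linear factor models.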

We study a mean-field version of rank-based models of equity markets such as the Atlas model introduced by Fernholz in the framework of Stochastic Portfolio Theory. We obtain an asymptotic description of the market when the number of companies grows to infinity. Then, we discuss the long-term capital distribution. We recover the Pareto-like shape of capital distribution curves usually derived from empirical studies, and provide a new description of the phase transition phenomenon observed by Chatterjee and Pal. Finally, we address the performance of simple portfolio rules and highlight the influence of the volatility structure on the growth of portfolios.

In the deterministic setting, a series of well-established results allows delay differential equations (DDEs) to be reformulated as evolution equations in infinite-dimensional spaces. Several models in the theoretical economics literature have been studied using this reformulation. In the stochastic case, by contrast, only a few results of this kind are available, and only for specific problems. The contribution of the present letter is to present a way to reformulate in infinite dimension a prototype controlled stochastic DDE in which the control variable appears delayed in the diffusion term. As applications, we present a model for quadratic-risk-minimization hedging of European options with execution delay and a time-to-build model with shocks. Some comments on the possible use of dynamic programming after the reformulation in infinite dimension conclude the letter.

In this report, we study delay optimization and energy efficiency in grid wireless sensor networks (WSNs). We focus on STDMA (Spatial Reuse TDMA) scheduling, where a predefined cycle is repeated and each node has fixed transmission opportunities during specific slots (defined by colors). We assume an STDMA algorithm that takes advantage of the regularity of the grid topology to also provide a spatially periodic coloring (a "tiling" of the same color pattern). In this setting, the key challenges are: 1) minimizing the average routing delay by ordering the slots in the cycle; and 2) being energy efficient. Our work follows two directions. First, the baseline performance is evaluated when nothing specific is done and the colors are randomly ordered in the STDMA cycle. Then, we propose a solution, ORCHID, that deliberately constructs an efficient STDMA schedule. It proceeds in two steps. In the first step, ORCHID starts from a colored grid and builds a hierarchical routing based on these colors. In the second step, ORCHID builds a color ordering by considering routing and scheduling jointly, so as to ensure that any node can reach a sink within a single STDMA cycle. We study the performance of these solutions by means of simulations and modeling. Results show the excellent performance of ORCHID in terms of delay and energy compared to a shortest-path routing that uses delay as a heuristic. We also present the adaptation of ORCHID to general networks under the SINR interference model.
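The one-cycle delivery property can be illustrated with a toy schedule. This is a simplified sketch, not ORCHID itself: it ignores colors and interference entirely, and only shows that ordering transmission slots by decreasing hop distance to the sink lets every packet cross the grid in a single cycle.

```python
from collections import deque

def hop_distances(width, height, sink):
    # BFS hop distance from every grid node to the sink.
    dist = {sink: 0}
    q = deque([sink])
    while q:
        x, y = q.popleft()
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nb[0] < width and 0 <= nb[1] < height and nb not in dist:
                dist[nb] = dist[(x, y)] + 1
                q.append(nb)
    return dist

def one_cycle_delivery(width, height, sink):
    # Nodes transmit in order of decreasing distance to the sink, so a
    # packet can be relayed hop by hop within a single cycle.
    dist = hop_distances(width, height, sink)
    order = sorted(dist, key=lambda n: -dist[n])
    # Every non-sink node initially holds one packet.
    buffer = {n: 1 for n in dist if n != sink}
    delivered = 0
    for node in order:
        if node == sink or buffer.get(node, 0) == 0:
            continue
        x, y = node
        # Forward everything to a neighbour strictly closer to the sink
        # (one always exists on a grid); that neighbour's slot comes later.
        nxt = min(((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)),
                  key=lambda n: dist.get(n, 10**9))
        moved = buffer.pop(node)
        if nxt == sink:
            delivered += moved
        else:
            buffer[nxt] = buffer.get(nxt, 0) + moved
    return delivered
```

On a 4x3 grid with the sink at a corner, all 11 packets reach the sink after a single pass over the slot ordering, which is the intuition behind joint routing/slot-ordering design.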

This paper describes several results of Wimmics, a research lab whose name stands for web-instrumented man-machine interactions, communities, and semantics. The approaches introduced here rely on graph-oriented knowledge representation, reasoning, and operationalization to model and support actors, actions, and interactions in web-based epistemic communities. The research results are applied to support and foster interactions in online communities and to manage their resources.

Triple therapy combining a protease inhibitor (PI) (telaprevir or boceprevir), pegylated interferon (PEG-IFN), and ribavirin (RBV) has dramatically increased the chance of eradicating hepatitis C virus (HCV). However, the efficacy of this treatment remains suboptimal in cirrhotic treatment-experienced patients. Here, we aimed to better understand the origin of this impaired response by estimating the antiviral effectiveness of each drug. Fifteen HCV genotype 1-infected patients with compensated cirrhosis, who were nonresponders to prior PEG-IFN/RBV therapy, were enrolled in a nonrandomized study. HCV RNA and concentrations of PIs, PEG-IFN, and RBV were frequently assessed in the first 12 weeks of treatment and were analyzed using a pharmacokinetic/viral kinetic model. The two PIs achieved similar levels of molar concentrations (P = 0.5), but there was a significant difference in the 50% effective concentrations (EC50) (P = 0.008), leading to greater effectiveness for telaprevir than for boceprevir in blocking viral production (99.8% versus 99.0%, respectively, P = 0.002). In all patients, the antiviral effectiveness of PEG-IFN was modest (43.4%), and there was no significant contribution of RBV exposure to the total antiviral effectiveness. The second phase of viral decline, which is attributed to the loss rate of infected cells, was slow (0.19 day⁻¹) and was higher in patients who subsequently eradicated HCV (P = 0.03). The two PIs achieved high levels of antiviral effectiveness. However, the suboptimal antiviral effectiveness of PEG-IFN/RBV and the low loss of infected cells suggest that a longer treatment duration might be needed in cirrhotic treatment-experienced patients and that a future IFN-free regimen may be particularly beneficial in these patients.
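The role of drug effectiveness in the biphasic viral decline can be sketched with a standard target-cell-limited viral kinetic model. This is an illustrative toy with hypothetical parameter values, not the model fit or the estimates of this study:

```python
def viral_load(eps, days=14.0, dt=0.001):
    # Standard viral kinetic model: effectiveness eps blocks viral
    # production. All parameter values below are illustrative only.
    delta = 0.19          # loss rate of infected cells (per day)
    p, c = 20.0, 6.0      # virion production and clearance rates
    T = 1e7               # target cells, held constant for simplicity
    V = 1e6               # baseline viral load (illustrative units)
    I = c * V / p         # infected cells at pre-treatment steady state
    beta = delta * I / (V * T)  # infection rate consistent with steady state
    t = 0.0
    while t < days:       # explicit Euler integration
        dI = beta * V * T - delta * I
        dV = (1 - eps) * p * I - c * V
        I += dI * dt
        V += dV * dt
        t += dt
    return V
```

With these toy numbers, a higher effectiveness (e.g. 99.8% versus 99.0%, as reported for the two PIs) yields a deeper first-phase drop, while the slow second phase is governed by the infected-cell loss rate delta.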

After recalling the context in which the first trigonometric tables were constructed in Ptolemy's time, and the synthesis Carnot made of them at the beginning of the 19th century, this article discusses three types of proofs of the Pythagorean theorem. It shows how the notion of a Pythagorean trigonometric table introduced here makes it possible to interpret the famous Babylonian tablet Plimpton 322, and notes its astonishing precision. Turning to the question of angle measurement, it shows, around the Antikythera mechanism, how the Greeks certainly used more tools than the ruler and compass alone, notably because they modeled planetary trajectories with epicycloids. These two tools nevertheless suffice to construct a regular pentagon. The article shows the equivalence, with these tools, between measuring the unit and measuring the golden ratio. It recalls Pythagorean triples built from Fibonacci numbers. Returning to puzzle-based proofs, it shows how certain optical illusions can result from them, but also how they give rise to certain mind games such as Archimedes' Stomachion. These games remain seriously relevant today, and with the windmill construction they have made it possible to exhibit new tilings of the plane and fractals. The article finally recalls how all Pythagorean triangles can be derived uniquely from the triple (3,4,5) by a tree construction that sheds new light on the Plimpton 322 tablet. It ends by evoking a result of Stewart generalizing the Pythagorean theorem.

In this paper, we show that half of the non-zero coefficients of the spinor zeta function of a Siegel cusp form of genus 2 are positive and half are negative.

We present a direct proof of Malus' theorem in geometrical optics founded on the symplectic structure of the set of all oriented straight lines in a Euclidean affine space.

Field-, temperature- and angle-dependent Fourier amplitudes of de Haas-van Alphen (dHvA) oscillations are calculated for compensated two-dimensional (2D) metals with a textbook Fermi surface (FS) composed of one hole orbit and two electron orbits connected by magnetic breakdown. It is demonstrated that, taking into account the opposite signs of electron and hole orbits, a given Fourier component involves a combination of several orbits, the contributions of which must be included in the calculations. Such an FS is observed in the strongly 2D organic metal alpha-'pseudo-kappa'-(ET)4H3O[Fe(C2O4)3].(C6H4Br2), the dHvA oscillations of which have been studied up to 55 T for various directions of the magnetic field with respect to the conducting plane. The calculations are in good quantitative agreement with the data.

Stability theory based on a variational principle and finite-time direct-adjoint optimization commonly relies on the kinetic perturbation energy density $E_1(t) = (1/V_\Omega)\int_\Omega e(\mathbf{x},t)\,\mathrm{d}\Omega$ (where $e(\mathbf{x},t) = |\mathbf{u}|^2/2$) as a measure of disturbance size. This type of optimization typically yields optimal perturbations that are global in the fluid domain $\Omega$ of volume $V_\Omega$. This paper explores the use of p-norms in determining optimal perturbations for 'energy' growth over prescribed time intervals of length T. For p = 1 the traditional energy-based stability analysis is recovered, while for large p >> 1, localization of the optimal perturbations is observed, which identifies confined regions, or 'hotspots', in the domain where significant energy growth can be expected. In addition, the p-norm optimization yields insight into the role and significance of various regions of the flow regarding the overall energy dynamics. As a canonical example, we solve the infinity-norm optimal perturbation problem for the simple case of two-dimensional channel flow. For such a configuration, several solution branches emerge, each identifying a different energy-production zone in the flow: either the centre or the walls of the domain. We study several scenarios (involving centre or wall perturbations) leading to localized energy production for different optimization time intervals. Our investigation reveals that even for this simple two-dimensional channel flow, the mechanism for the production of a highly energetic and localized perturbation is not unique in time. We show that wall perturbations are optimal (with respect to the infinity-norm) for relatively short and long times, while centre perturbations are preferred for very short and intermediate times. The developed p-norm framework is intended to facilitate worst-case analysis of shear flows and to identify localized regions supporting dominant energy growth.
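A plausible form of the generalized disturbance measure, written in our notation under the assumption that the p-norm is applied to the local energy density, is:

```latex
E_p(t) \;=\; \left( \frac{1}{V_\Omega} \int_\Omega e(\mathbf{x},t)^{\,p}\,
\mathrm{d}\Omega \right)^{1/p},
\qquad e(\mathbf{x},t) = \tfrac{1}{2}\,\lvert \mathbf{u} \rvert^{2},
```

so that $p = 1$ recovers the volume-averaged energy $E_1$, while $p \to \infty$ weights the spatial maximum of $e$, which is why large-$p$ optimization favours spatially localized perturbations.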

Imaging polarimetry is an important tool for the study of cosmic magnetic fields. In our Galaxy, polarization levels from a few percent up to $\sim$10\% are measured in the submillimeter dust emission from molecular clouds and in the synchrotron emission from supernova remnants. Only a few techniques exist to image the distribution of polarization angles, as a means of tracing the plane-of-sky projection of the magnetic field orientation. At submillimeter wavelengths, polarization is either measured as the differential total power of polarization-sensitive bolometer elements, or by modulating the polarization of the signal. Bolometer arrays such as LABOCA at the APEX telescope are used to observe the continuum emission from fields as large as $\sim0\fdg2$ in diameter. Here we present PolKa, a polarimeter for LABOCA with a reflection-type waveplate of at least 90\% efficiency. The modulation efficiency depends mainly on the sampling and on the angular velocity of the waveplate. For the data analysis, the concept of generalized synchronous demodulation is introduced. The instrumental polarization towards a point source is at the level of $\sim0.1$\%, increasing to a few percent at the $-10$ dB contour of the main beam. A method to correct for its effect in observations of extended sources is presented. Our map of the polarized synchrotron emission from the Crab nebula is in agreement with structures observed at radio and optical wavelengths. The linear polarization measured in OMC1 agrees with results from previous studies, while the high sensitivity of LABOCA enables us to also map the polarized emission of the Orion Bar, a prototypical photon-dominated region.

Waves play a dominant role both in the morphological evolution of coastlines and in flooding phenomena. This article aims to provide a better knowledge of strong-swell events (number, return period, spatial extent) on a multi-decadal scale (January 1958 - August 2002). To this end, the BoBWA-10kH database is analyzed to derive extreme-value statistics along the French Atlantic and Channel coasts, making it possible to estimate the return period of observed wave heights (Tr(Hs)). The major strong-swell events are presented and analyzed. Seven events with swell heights of return period greater than 50 years, affecting the various French coasts, are identified. The extent of the events along the coastline is also examined (extension of swell conditions with Tr(Hs) > 10 years). The largest spatial extents are recorded in December 1979 (≈ 950 km) and February 1989 (≈ 650 km). In addition, this analysis highlights remarkable periods in terms of successions of strong-swell events a few weeks or even a few days apart, as in 1965 and 1989-1990. Regarding the characterization of notable events, the return periods of the waves during the Lothar and Martin storms are also analyzed, revealing relatively moderate wave heights during these events (Tr(Hs) < 10 years). Finally, the use of the BoBWA-X database for the analysis of events after 2002 is discussed.

This paper presents a multicontinuum approach to model fractal temporal scaling of catchment response in hydrological systems. The temporal scaling of discharge is quantified in the frequency domain by the transfer function H(ω), defined as the ratio between the spectra of the catchment response and recharge time series. The transfer function may scale with frequency ω as H(ω) ∼ ω^(−β). While the classical linear and Dupuit models predict exponents of β = 2 and β = 1, observations indicate scalings with noninteger exponents β. Such behaviors have been described by multifractal models, which, however, often lack a relation to the medium characteristics. We revisit and extend the classical linear and Dupuit aquifer models and discuss their physical meanings in the light of the resulting aquifer dynamics. On the basis of these classical models, we derive a multicontinuum approach that provides physical recharge models able to explain fractal behaviors with exponents 1/2 < β < 2. Furthermore, this approach makes it possible to link the fractal dynamics of the discharge process to the physical aquifer characteristics as reflected in the distribution of storage time scales. We systematically analyze the catchment responses in the proposed multicontinuum models, and identify and quantify the time scales which characterize the dynamics of the catchment response to recharge.
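In practice the scaling exponent β is estimated from the slope of the transfer function on a log-log plot. A minimal sketch, with synthetic data standing in for real discharge/recharge spectra:

```python
import math

def fit_scaling_exponent(freqs, H):
    # Least-squares fit of log H = a - beta * log(omega), i.e. the
    # spectral scaling H(omega) ~ omega^(-beta); returns beta.
    xs = [math.log(w) for w in freqs]
    ys = [math.log(h) for h in H]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic check: a Dupuit-like response with beta = 1
freqs = [10 ** (k / 10) for k in range(-20, 21)]
H = [w ** -1.0 for w in freqs]
```

Applied to observed spectra, noninteger slopes between the classical limits β = 1 and β = 2 are the signature of the fractal behavior the multicontinuum model is built to explain.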

Research on asynchronous cellular automata has received a great deal of attention in recent years and has turned into a thriving field. We survey the recent research that has been carried out on this topic and present a broad state of the art in which both computing and modelling issues are represented.

This study deals with the evolution of so-called "intelligent" networks (insect societies without a leader, the cells of an organism, the brain, ...) during their apprenticeship period. First we briefly summarize Version 2 (published in French), whose main characteristics are: 1) a network connected to its environment is considered as immersed in an information field created by this environment, which thereby dictates the apprenticeship constraints to it; 2) the formalism used draws its inspiration from that of quantum field theory (principle of stationary action, gauge fields, invariance under symmetry transformations, ...); 3) we obtain Lagrange equations whose solutions describe the network's evolution during the whole apprenticeship period; 4) then, proceeding with the same formal inspiration, we suggest other lines of study capable of advancing knowledge in the considered scope. In a second part, after a reminder of the points to be improved, we exhibit Version 3, which brings, we think, relevant improvements. Indeed: 5) we consider the weighted averages of the variables, which introduces probabilities; 6) we define two observables (L, the average information flux, and A, the activity of the network) which could be measured and thus compared with experimental results; 7) we find that L, the weighted average of information flows, is an invariant; 8) finally, we propose two expressions for the conactance, from which we deduce the corresponding Lagrange equations that have to be solved to know the evolution of the considered weighted averages. At the present stage, however, we think that we can progress only by carrying out experiments (see projects like the Human Brain Project) and discovering invariants and symmetries which would allow us, as in physics, to classify networks and, above all, to better understand the connections between them.
Indeed, and this is what we propose among the future research directions, the underlying problem is to understand how, after their apprenticeship period, several networks can connect together to produce, in the case of the brain for instance, what we call mental states.

The collective behaviour of soliton ensembles (i.e. the solitonic gas) is studied using direct numerical simulation. Traditionally this problem was addressed in the context of integrable models such as the celebrated KdV equation. We extend this analysis to non-integrable KdV-BBM type models. Some high-resolution numerical results are presented in both the integrable and non-integrable cases. Moreover, the free-surface elevation probability distribution is shown to be quasi-stationary. Finally, we employ asymptotic methods along with Monte-Carlo simulations in order to study quantitatively the dependence of some important statistical characteristics (such as the kurtosis and skewness) on the Stokes-Ursell number (which measures the relative importance of nonlinear effects compared to dispersion) and on the magnitude of the BBM term.

What is a map? What is its utility? What is a location, a behaviour? What are navigation, localization and prediction for a mobile robot facing a given task? These questions have neither unique nor straightforward answers to this day, and are still at the core of numerous research domains. Robotics, for instance, aims at answering them to create successful sensori-motor artefacts. Cognitive sciences use these questions as intermediate goals on the road to understanding living beings, their skills, and, furthermore, their intelligence. Our study lies between these two domains. We first study classical probabilistic approaches (Markov localization, POMDPs, HMMs, etc.), then some biomimetic approaches (Berthoz, Franz, Kuipers). We analyze their respective advantages and drawbacks in light of a general formalism for robot programming based on Bayesian inference (BRP). We propose a new probabilistic formalism for modelling the interaction between a robot and its environment: the Bayesian map. In this framework, defining a map amounts to specifying a particular probability distribution, and some of the questions above then amount to solving inference problems. We define operators for putting maps together, so that "hierarchies of maps" and incremental development play a central role in our formalism, as in biomimetic approaches. By using the Bayesian formalism, we also benefit both from a unified means of dealing with uncertainties and from clear and rigorous mathematical foundations. Our formalism is illustrated by experiments implemented on a Koala mobile robot.
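Defining a map as a probability distribution means that localization reduces to Bayesian inference. A minimal Markov-localization update over discrete states, purely illustrative and not the BRP/Bayesian-map formalism itself:

```python
def markov_localization_step(belief, transition, likelihood):
    # Prediction: propagate the belief through the motion model
    # transition[i][j] = P(next = j | current = i), then correct it
    # with the observation likelihood (Bayes rule) and renormalize.
    n = len(belief)
    predicted = [sum(transition[i][j] * belief[i] for i in range(n))
                 for j in range(n)]
    posterior = [likelihood[j] * predicted[j] for j in range(n)]
    z = sum(posterior)
    return [p / z for p in posterior]
```

Starting from a uniform belief, a single observation peaked at one state immediately concentrates the posterior there; iterating this prediction/correction cycle is the core of Markov localization, POMDP belief tracking, and HMM filtering alike.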

In this paper, we put forward a computational framework for the comparison of motor, auditory, and perceptuo-motor theories of speech communication. We first recall the basic arguments of these three sets of theories, as applied to speech perception or to speech production. Then we expose a unifying Bayesian model able to express each theory in a probabilistic way. Focusing on speech perception, we demonstrate that under two hypotheses, regarding communication noise and inter-speaker variability, providing perfect conditions for speech communication, motor and auditory theories are indistinguishable. We then degrade each hypothesis successively to study the distinguishability of the different theories in "adverse" conditions. We first present simulations on a simplified implementation of the model with one-dimensional sensory and motor variables, and secondly we consider a simulation of the human vocal tract providing more realistic auditory and articulatory variables. Simulation results allow us to emphasise the respective roles of motor and auditory knowledge in various adverse conditions of speech perception, and to suggest some guidelines for future studies aiming at assessing the role of motor knowledge in speech perception.

Recently, researchers have proposed various intersection management techniques that enable autonomous vehicles to cross an intersection without traffic lights or stop signs. In particular, a priority-based coordination system with provable collision-free and deadlock-free properties has been presented. In this paper, we extend the priority-based approach to support legacy vehicles without compromising the above-mentioned properties. We make the hypothesis that legacy vehicles are able to keep a safe distance from their leading vehicles. We then explore some special configurations of the system that ensure the safe crossing of legacy vehicles. We implement the extended system in the realistic traffic simulator SUMO, and simulations are performed to demonstrate the safety of the system.
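The two ingredients, priority-based yielding at the conflict zone and the safe-distance hypothesis for legacy vehicles, can be sketched in a toy longitudinal simulation. Everything here is hypothetical (conflict-zone geometry, speeds, headway) and is not the paper's SUMO implementation:

```python
def simulate(steps=200, dt=0.1):
    # Positions along two crossing paths; the shared conflict zone is the
    # interval [50, 55] on each path. Vehicle A has priority, vehicle B
    # yields to A, and L is a legacy vehicle that merely keeps a safe
    # headway behind its leader A on the same path.
    conf_lo, conf_hi = 50.0, 55.0
    safe_gap = 8.0
    a, b, l = 40.0, 40.0, 25.0
    v = 10.0
    log = []
    for _ in range(steps):
        a += v * dt                      # priority vehicle never brakes
        ahead_b = b + v * dt
        # B enters the conflict zone only once A has cleared it.
        if not (conf_lo <= ahead_b <= conf_hi and a <= conf_hi):
            b = ahead_b
        # L advances only while its headway behind A exceeds the safe gap.
        if a - l > safe_gap:
            l += v * dt
        log.append((a, b, l))
    return log
```

In the trace, A and B are never simultaneously inside the conflict zone and the legacy vehicle's headway never drops below the safe gap, which is the invariant the extended system is designed to preserve.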

The Urabá-Darién region, located in northwestern Colombia, is one of the areas of the world with the greatest biotic richness and diversity. In recent years, however, this zone has been the epicenter of a grave humanitarian crisis, and its ecosystem is endangered by the prospects of industrialization of the area. Urabá has become a privileged site for projects and megaprojects with a strong socio-environmental impact. The declared objective of these undertakings is to "open up" the Darién and thus breach the barrier of jungle that prevents direct communication between the two halves of the American continent, while also endowing it with industrial and transport infrastructure. This article proposes an alternative reading of the reasons that have historically justified such connection plans, showing that they do not correspond to a will to provide the local population with a mass popular and commercial transport system, but must instead be interpreted as a response to mere demands of internal and continental security.

Let me ask you something... Why, for five centuries, has the cycle of violence in Colombia never been interrupted? Why has injustice taken hold of those lands, and oblivion of those souls? Why Colombia and not Switzerland or Canada? And forgive me for these probably banal and unoriginal questions: I am Italian, and my land seems to suffer from the same disease. So let us see whether, through a joint and sufficiently critical reflection, we can come to propose some concrete answers to our existential concerns.

This article aims to show that those who have designed the various megaprojects in Urabá (Colombia) have displayed a pathological disregard for the mechanisms of substantive integration: any decision with territorial implications must take into account the actual participation of the inhabitants of the areas where infrastructure plans are to be developed, admitting the possibility that these people may answer in the negative and reject projects they do not consider in line with their own concept of development.

A biographical note on a feather currency.

Drawing on research carried out in Australia and France, and on situations of controversy encountered during the fieldwork, this text discusses the practices of the Aboriginal art market and the legislative framework regulating them.

Focusing on the Indigenous Australian Art Commercial Code of Conduct funded initially by the Australia Council and directed by several peak bodies for the visual arts and Indigenous arts - an initiative that has emerged from the Senate Inquiry into Australia's Indigenous visual arts sector, 2006 - I analyse the way French art dealers specialized in Aboriginal arts discuss and locate value differences within or between the economic and cultural domains. The presentation of this ethnographic data will contribute to the discussion on how procedures can be designed to discuss and respect value differences.

This research has been conducted in the context of the ArtiMuse project, which aims at the modeling and renewal of rare gestural knowledge and skills involved in traditional craftsmanship, and more precisely in the art of wheel-throwing pottery. This knowledge and these skills form part of the Intangible Cultural Heritage: they are the fruit of diverse expertise founded and propagated over the centuries thanks to the ingeniousness of the gesture and the creativity of the human spirit. Nowadays, this expertise is very often threatened with disappearance because of the difficulty of resisting globalization and the fact that most of these "expertise holders" are not easily accessible due to geographical or other constraints. In this paper, a methodological framework for capturing and modeling gestural knowledge and skills in wheel-throwing pottery is proposed. It is based on capturing gestures using wireless inertial sensors and on statistical modeling. In particular, we used a system that allows for the online alignment of gestures using a modified Hidden Markov Model. This methodology is implemented in a human-computer interface, which permits both the modeling and the recognition of expert technical gestures. The system could be used to assist in the learning of these gestures by giving continuous feedback in real time, measuring the difference between expert and learner gestures. It has been tested and evaluated on different potters with a rare expertise that is strongly related to their local identity.
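The online alignment idea can be sketched with a left-to-right HMM forward pass: the belief over hidden states tracks how far along the reference gesture the learner currently is. This is a minimal illustration of gesture following, not the project's modified model:

```python
def forward_step(alpha, transition, obs_likelihood):
    # One online step of the HMM forward pass: alpha[j] is the
    # (normalized) probability of being at position j of the reference
    # gesture given the sensor observations so far.
    n = len(alpha)
    new = [obs_likelihood[j]
           * sum(alpha[i] * transition[i][j] for i in range(n))
           for j in range(n)]
    z = sum(new) or 1.0
    return [a / z for a in new]

def most_likely_position(alpha):
    # Current estimate of the learner's position within the gesture.
    return max(range(len(alpha)), key=lambda j: alpha[j])
```

With a left-to-right transition matrix (stay or advance), feeding observations that successively resemble later portions of the reference gesture walks the belief forward through the model, which is what enables continuous real-time feedback.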

While methods based on functional approaches for uncertainty quantification in physical models have reached maturity, multiscale stochastic models have recently been the focus of new numerical developments. Here we specifically take an interest in multiscale problems with numerous localized uncertainties at a micro level, which can be associated with some variability in the operator or source terms, or even with some geometrical uncertainty. In order to handle the high dimensionality and complexity that arise from such problems, a multiscale method based on patches has emerged as a relevant candidate for exploiting the localized nature of the uncertainties, and has been extended to the stochastic framework in [1]. It proposes an efficient iterative global-local algorithm in which the global problems at the macro level are made simple by introducing a fictitious patch, which makes it possible to define the (possibly coarse) global problem on a domain that contains no small-scale geometrical details and that involves a deterministic operator. At the micro level, specific reformulations of the local problems using fictitious domain methods [2] are introduced when the patch contains internal boundaries, in order to formulate the local problem on a tensor-product space. The global and local problems are solved using tensor-based approximation methods [3] that allow the representation of high-dimensional stochastic parametric solutions and at the same time make the stochastic methods non-intrusive. In the present work, the approach is extended to problems with local nonlinearities within the patches, for which convergence properties are shown. We also consider patches with variable positions, which involve nonconforming interfaces and raise questions of approximation stability and optimal convergence with respect to the mesh.

The novelist Henry James shared with his brother William, the author of the Principles of Psychology, a deep interest in the ways in which personal identity is built through one's history and experiences. At the end of his life, Henry James suffered a vascular stroke in the right hemisphere and developed a striking identity delusion. He dictated in a perfectly clear and coherent manner two letters as if they were written by Napoleon Bonaparte. He also showed signs of reduplicative paramnesia. Negative symptoms resulting from right hemisphere damage may disrupt the feelings of "warmth and intimacy and immediacy" and the "resemblance among the parts of a continuum of feelings (especially bodily feelings)", which are the foundation of personal identity according to William James. On the other hand, a left hemisphere receiving inadequate input from the damaged right hemisphere may produce positive symptoms such as delusional, confabulatory narratives. Other fragments dictated during Henry James's final disease reveal some form of insight, if partial and disintegrated, into his condition. Thus, even when consciousness is impaired by brain damage, something of its deep nature may persist, as attested by the literary characteristics of the last fragments of the Master.

The aim of the present study was to evaluate the performance characteristics of all the ventilators proposed for home noninvasive positive-pressure ventilation in children in France. The ventilators (one volume-targeted, 12 pressure-targeted and four dual) were evaluated on a bench which simulated six different paediatric ventilatory patterns. For each ventilator, the quality of the inspiratory and expiratory trigger and the ability to reach and maintain the preset pressures and volumes were evaluated with the six patient profiles. The performance of the ventilators showed great variability, and depended upon the type of trigger (flow or pressure), type of circuit and patient profile. Differences were observed between the preset and measured airway pressure and between the tidal volume measured by the ventilator and on the bench. Leaks were associated with an inability to detect the patient's inspiratory effort or autotriggering. No single ventilator was able to adequately ventilate the six paediatric profiles. Only a few ventilators were able to ventilate the profiles simulating the youngest patients. A systematic paediatric bench evaluation is recommended for every ventilator proposed for home ventilation, in order to detect any dysfunction and guide the choice of the appropriate ventilator for a specific patient.

The task of estimating a matrix from a sample of observed entries is known as the \emph{matrix completion problem}. Most work on matrix completion has focused on recovering an unknown real-valued low-rank matrix from a random sample of its entries. Here, we investigate the case of highly quantized observations, in which the measurements can take only a small number of values. These quantized outputs are generated according to a probability distribution parametrized by the unknown matrix of interest. This model corresponds, for example, to ratings in recommender systems or labels in multi-class classification. We consider a general, non-uniform sampling scheme and give theoretical guarantees on the performance of a constrained, nuclear-norm-penalized maximum likelihood estimator. One important advantage of this estimator is that it does not require knowledge of the rank or an upper bound on the nuclear norm of the unknown matrix; it is thus adaptive. We provide lower bounds showing that our estimator is minimax optimal. An efficient algorithm based on lifted coordinate gradient descent is proposed to compute the estimator. A limited Monte Carlo experiment, using both simulated and real data, is provided to support our claims.
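A toy version of the model can be sketched as follows. We fit binary (1-bit) observations with a logistic link by gradient descent on a fixed-rank factorization, a Burer-Monteiro style surrogate for the low-rank constraint rather than the convex nuclear-norm-penalized estimator analyzed in the paper; all names and parameters are illustrative.

```python
import math, random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def nll(U, V, Y, omega):
    """Negative log-likelihood of binary entries Y[(i, j)] in {0, 1}
    under a logistic link applied to the entries of U @ V^T."""
    total = 0.0
    for (i, j) in omega:
        m = sum(u * v for u, v in zip(U[i], V[j]))
        p = sigmoid(m) if Y[(i, j)] == 1 else 1.0 - sigmoid(m)
        total -= math.log(max(p, 1e-12))
    return total

def fit_1bit(Y, omega, d1, d2, rank=1, lr=0.2, steps=800, seed=0):
    """Gradient descent on the rank-constrained logistic likelihood,
    using only the entries in the sampling set omega."""
    rng = random.Random(seed)
    U = [[rng.gauss(0.0, 0.1) for _ in range(rank)] for _ in range(d1)]
    V = [[rng.gauss(0.0, 0.1) for _ in range(rank)] for _ in range(d2)]
    for _ in range(steps):
        gU = [[0.0] * rank for _ in range(d1)]
        gV = [[0.0] * rank for _ in range(d2)]
        for (i, j) in omega:
            m = sum(u * v for u, v in zip(U[i], V[j]))
            g = sigmoid(m) - Y[(i, j)]   # d(nll)/dm for one observed entry
            for k in range(rank):
                gU[i][k] += g * V[j][k]
                gV[j][k] += g * U[i][k]
        for i in range(d1):
            for k in range(rank):
                U[i][k] -= lr * gU[i][k]
        for j in range(d2):
            for k in range(rank):
                V[j][k] -= lr * gV[j][k]
    return U, V
```

With ratings data, `Y` would hold thresholded ratings on the observed (user, item) pairs and predictions on unobserved pairs come from the sign (or value) of the fitted low-rank matrix.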

Aircraft engine manufacturers collect large amounts of engine-related data during flights. These data are used to detect anomalies in the engines and thus help companies optimize their maintenance costs. This article introduces and studies a generic methodology for building automatic detectors of early signs of anomalies in a way that remains understandable by the human operators who make the final maintenance decision. The main idea of the method is to generate a very large number of binary indicators based on parametric anomaly scores designed by experts, complemented by simple aggregations of those scores. The best indicators are then selected via a classical forward scheme, leading to a much smaller number of indicators tuned to the data set. We illustrate the interest of the method on simulated data containing realistic early signs of anomalies.
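The forward scheme can be sketched as follows: a greedy loop that repeatedly adds the binary indicator giving the largest improvement of a simple score. Here the score is the accuracy of a majority vote over the selected indicators; the scoring criterion actually used by the authors is not stated in the abstract, so this choice is illustrative.

```python
def forward_select(indicators, labels, max_features=3):
    """Greedy forward selection over binary indicators: at each step,
    add the indicator that most improves the score of the selected
    subset; stop when no candidate improves it or the budget is spent.
    indicators[k] is a 0/1 list aligned with the 0/1 labels."""
    def score(selected):
        if not selected:
            return 0.0
        correct = 0
        for row, y in zip(zip(*[indicators[k] for k in selected]), labels):
            vote = 1 if 2 * sum(row) > len(row) else 0  # majority vote
            correct += (vote == y)
        return correct / len(labels)

    selected, best = [], 0.0
    remaining = list(range(len(indicators)))
    while remaining and len(selected) < max_features:
        cand = max(remaining, key=lambda k: score(selected + [k]))
        new = score(selected + [cand])
        if new <= best:        # no indicator improves the score: stop
            break
        selected.append(cand)
        remaining.remove(cand)
        best = new
    return selected, best
```

The output is a short, named list of expert-designed indicators, which is what keeps the final detector interpretable for the operators making the maintenance decision.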
