This chapter covers applications of quantum computing in the area of nuclear and particle physics. We cover algorithms for simulating quantum field theories, where end-to-end problems include computing fundamental physical quantities and scattering cross sections. We also discuss simulations of nuclear physics, which encompasses individual nuclei as well as dense nucleonic matter such as neutron stars.
This chapter covers applications of quantum computing in the area of quantum chemistry, where the goal is to predict the physical properties and behaviors of atoms, molecules, and materials. We discuss algorithms for simulating electrons in molecules and materials, including both static properties such as ground state energies and dynamic properties. We also discuss algorithms for simulating static and dynamic aspects of vibrations in molecules and materials.
To enhance the radiological and nuclear emergency preparedness of hospitals while responding to the refugee crisis, the Government of the Republic of Moldova implemented an innovative approach supported by the World Health Organization (WHO). This initiative featured a comprehensive package that integrated health system assessment, analysis of existing plans and procedures, and a novel medical training component. The training, based on relevant WHO and International Atomic Energy Agency (IAEA) guidance, combined theory with contemporary adult learning solutions, such as practical skill stations, case reviews, and clinical simulation exercises.
This method allowed participants to identify and address gaps in their emergency response capacities, enhancing their ability to ensure the medical management of radiological and nuclear events. The course is both innovative and adaptable, offering a potential model for other countries seeking to strengthen the radiological and nuclear emergency response capabilities of acute care clinical providers.
Due to the scarcity of data, the demographic regime of pre-plague England is poorly understood. In this article, we review the existing literature to estimate the mean age at first marriage for women (at 24) and men (at 27), the remaining life expectancy at first marriage for men (at 25 years), the mean household size (at 5.8), and marital fertility around 1300. Based on these values, we develop a macrosimulation that creates a consistent picture of English demography at its medieval population peak, one that reflects a Western European marriage pattern with a comparatively high share of celibates.
This chapter surveys some of the many types of models used in science, and some of the many ways scientists use models. Of particular interest for our purposes are the relationships between models and other aspects of scientific inquiry, such as data, experiments, and theories. Our discussion shows important ways in which modeling can be thought of as a distinct and autonomous scientific activity, even though models remain crucial for making use of data and theories and for performing experiments. The growing reliance on simulation models has raised new and important questions about the kind of knowledge gained by simulations and the relationship between simulation and experimentation. Is it important to distinguish between simulation and experimentation, and if so, why?
Mental imagery can be used to simulate imminent, distant possible, or even impossible futures. Such mental simulation enables people to explore the consequences of different actions they want to perform or the consequences of being in different kinds of situations. Predictive simulation retrieves embodied knowledge but also creates new knowledge because people can compare different simulated scenarios and draw conclusions from that.
This report describes the implementation and evaluation of a unique escape room game/unfolding public health preparedness simulation in nursing education. The innovative approach was designed to teach disease investigation, epidemiological principles, and technical skills such as tuberculosis (TB) skin testing techniques.
Methods
The escape room/unfolding health preparedness simulation was implemented with 29 pre-licensure nursing students and involved game-like activities as well as a realistic disaster simulation scenario with standardized patients.
Results
The project yielded positive outcomes, with students demonstrating increased knowledge and confidence. Students also recommended the simulation for teaching disaster preparedness, highlighting its effectiveness. Evaluation data also suggested refining the simulation's treatment of the nurses' roles.
Conclusions
While implementing this teaching innovation had challenges, the approach enhanced active learning, critical thinking, and teamwork in nursing education, preparing students for real-world health care challenges. The project underscores the importance of such simulations in training nursing students for public health emergencies. It also highlights the need for further research to assess long-term impacts on student outcomes, indicating the potential for continued improvement and development in the field.
The chapter outlines key principles in Cognitive CDA, which inherits its social theory from CDA and, from cognitive linguistics, a particular view of language and a framework for analysing language (as well as other semiotic modes). In connection with CDA, the chapter describes the dialectical relationship conceived between discourse and society. Key concepts relating to the dialogicality of discourse are also introduced, namely intertextuality and interdiscursivity. The central role of discourse in maintaining power and inequality is described, with a focus on the ideological and legitimating functions of language and conceptualisation. In connection with cognitive linguistics, the chapter describes the non-autonomous nature of language, the continuity between grammar and the lexicon, and the experiential grounding of language. The key concept of construal and its implications for ideology in language and conceptualisation are discussed. A framework in which construal operations are related to discursive strategies and to domain-general cognitive systems and processes is set out. The chapter closes by briefly introducing the main models and methods of Cognitive CDA.
Aim:
To test educational interventions to increase the quality of care in telemedicine.
Background:
Telemedicine (TM) has become an essential tool to practise medicine around the world. However, education to address clinical skills in TM remains an area of need globally across the health professions. We aim to evaluate the impact of a pilot online learning platform (OLP) and standardized coaching programme on the quality of medical student TM clinical skills.
Methods:
A randomized pilot study was conducted with fourth-year medical students (n = 12). All participants engaged in video-recorded standardized patient (SP) simulated encounters to assess TM clinical skills before and after the intervention. Participants were randomized to either the OLP or OLP + Virtual Coaching Institute (VCI) intervention cohort. Quantitative and qualitative data were collected to address self-reported skills, attitudes, and self-efficacy before the 1st SP encounter and after the 2nd SP encounter. SP encounter recordings were scored by two blinded non-investigator raters based on a standardized rubric to measure the change in TM care delivered pre- and post-intervention. Statistical analysis of quantitative data included descriptive statistics and mixed effects ANOVA.
Findings:
Recruitment and retention of participants exceeded expectations, pointing to significant enthusiasm for this educational opportunity. Self-reported skills and scored simulation skills demonstrated significant improvements for all participants receiving the interventions. Both the OLP and VCI interventions were well received and feasible, and demonstrated statistically significant efficacy in improving TM clinical skills. Participants who received coaching described more improvements in self-efficacy, confidence, and overall virtual clinical skills. This study provides evidence that virtualized clinical learning environments can positively impact the development of TM clinical skills among medical students. As TM continues to evolve, the implementation of innovative training approaches will be crucial in preparing the next generation of healthcare professionals for the demands of modern healthcare delivery.
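As a concrete illustration of the kind of pre/post analysis described in the Methods above, here is a minimal sketch of a mixed-effects model in Python. The data file and the column names (participant, phase, cohort, score) are hypothetical, and the paper's actual rubric and model specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per scored encounter, with hypothetical columns
# participant, phase (pre/post), cohort (OLP vs. OLP + VCI), score.
df = pd.read_csv("tm_skills_long.csv")

# Random intercept per participant; fixed effects for phase, cohort,
# and their interaction (the pre/post-by-arm effect of interest).
model = smf.mixedlm("score ~ phase * cohort", data=df, groups=df["participant"])
result = model.fit()
print(result.summary())
```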
This paper proposes an ordinal generalization of the hierarchical classes model originally proposed by De Boeck and Rosenberg (1988). Any hierarchical classes model implies a decomposition of a two-way two-mode binary array M into two component matrices, called bundle matrices, which represent the association relation and the set-theoretical relations among the elements of both modes in M. Whereas the original model restricts the bundle matrices to be binary, the ordinal hierarchical classes model assumes that the bundles are ordinal variables with a prespecified number of values. This generalization results in a classification model with classes ordered along ordinal dimensions. The ordinal hierarchical classes model is shown to subsume Coombs and Kao's (1955) model for nonmetric factor analysis. An algorithm is described to fit the model to a given data set and is subsequently evaluated in an extensive simulation study. An application of the model to student housing data is discussed.
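To make the decomposition concrete, the following sketch reconstructs a data array from ordinal bundle matrices using the max-min composition rule commonly associated with ordinal hierarchical classes models; the matrices S and P are illustrative, not taken from the paper.

```python
import numpy as np

# Illustrative ordinal bundle matrices: rows of S index the first mode,
# rows of P the second mode; the two columns are bundles taking values
# in {0, 1, 2} (a prespecified number of ordinal values).
S = np.array([[2, 0],
              [1, 1],
              [0, 2]])
P = np.array([[2, 1],
              [0, 2]])

# Max-min composition: M[i, j] = max_k min(S[i, k], P[j, k]).
# With binary bundles this reduces to the original hierarchical classes rule.
M = np.max(np.minimum(S[:, None, :], P[None, :, :]), axis=2)
print(M)
```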
The Vale–Maurelli (VM) approach to generating non-normal multivariate data involves the use of Fleishman polynomials applied to an underlying Gaussian random vector. This method has been extensively used in Monte Carlo studies during the last three decades to investigate the finite-sample performance of estimators under non-Gaussian conditions. The validity of conclusions drawn from these studies clearly depends on the range of distributions obtainable with the VM method. We deduce the distribution and the copula for a vector generated by a generalized VM transformation, and show that it is fundamentally linked to the underlying Gaussian distribution and copula. In the process we derive the distribution of the Fleishman polynomial in full generality. While data generated with the VM approach appears to be highly non-normal, its truly multivariate properties are close to the Gaussian case. A Monte Carlo study illustrates that generating data with a different copula than that implied by the VM approach severely weakens the performance of normal-theory based ML estimates.
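A minimal sketch of a VM-style generator, assuming Fleishman's published moment equations: solve for the cubic polynomial coefficients, then apply the polynomial to correlated standard normals. The target skewness, kurtosis, and correlation are illustrative, and the full VM step of solving for an intermediate correlation matrix (so that the post-transformation correlations hit the target exactly) is omitted here.

```python
import numpy as np
from scipy.optimize import fsolve

def fleishman_coeffs(skew, exkurt):
    """Solve Fleishman's moment equations for (b, c, d), with a = -c."""
    def eqs(p):
        b, c, d = p
        return [b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1,
                2*c*(b**2 + 24*b*d + 105*d**2 + 2) - skew,
                24*(b*d + c**2*(1 + b**2 + 28*b*d)
                    + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - exkurt]
    b, c, d = fsolve(eqs, [1.0, 0.0, 0.0])
    return -c, b, c, d

# Illustrative targets: marginal skewness 0.75, excess kurtosis 1.0,
# correlation 0.5 between the two underlying normals.
a, b, c, d = fleishman_coeffs(skew=0.75, exkurt=1.0)
rng = np.random.default_rng(0)
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])
Z = rng.multivariate_normal(np.zeros(2), R, size=100_000)
Y = a + b*Z + c*Z**2 + d*Z**3          # same cubic applied to each marginal

# Without the intermediate-correlation step, the realized correlation
# deviates slightly from the 0.5 specified for the underlying normals.
print(np.corrcoef(Y, rowvar=False)[0, 1])
```

Note how the construction makes the paper's point visible: every marginal is pushed through a cubic, but the dependence structure is still the Gaussian copula of Z.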
Six different algorithms to generate widely different non-normal distributions are reviewed. These algorithms are compared in terms of speed, simplicity and generality of the technique. The advantages and disadvantages of using these algorithms are briefly discussed.
The use of p-values in combining the results of independent studies often involves studies that are potentially aberrant either in quality or in actual values. A robust data analysis suggests the use of a statistic that takes these aberrations into account by trimming some of the largest and smallest p-values. We present a trimmed statistic based on an inverse cumulative normal transformation of the ordered p-values, together with a simple and convenient method for approximating the distribution and first two moments of this statistic.
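The construction of the statistic is straightforward to sketch. In the version below the trimming counts are illustrative, and calibration against the paper's moment-based approximation to the null distribution is omitted.

```python
import numpy as np
from scipy.stats import norm

def trimmed_inverse_normal(pvals, g_low=1, g_high=1):
    """Sum of probit-transformed p-values after trimming the g_low smallest
    and g_high largest p-values (null calibration omitted)."""
    z = np.sort(norm.ppf(np.asarray(pvals)))   # smallest p -> most negative z
    kept = z[g_low:len(z) - g_high]
    return kept.sum()

print(trimmed_inverse_normal([0.001, 0.02, 0.04, 0.30, 0.97]))
```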
We give an account of Classical Test Theory (CTT) in terms of the more fundamental ideas of Item Response Theory (IRT). This approach views classical test theory as a very general version of IRT, and the commonly used IRT models as detailed elaborations of CTT for special purposes. We then use this approach to CTT to derive some general results regarding the prediction of the true-score of a test from an observed score on that test as well as from an observed score on a different test. This leads us to a new view of linking tests that were not developed to be linked to each other. In addition, we propose true-score prediction analogues of the Dorans and Holland measures of the population sensitivity of test linking functions. We illustrate the accuracy of the first-order theory using simulated data from the Rasch model, and illustrate the effect of population differences using a set of real data.
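The classical first-order prediction of a true score from an observed score on the same test is Kelley's regression formula. The sketch below, with all settings illustrative, checks it against data simulated from the Rasch model, estimating reliability by Cronbach's alpha.

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 2000, 40
theta = rng.normal(0, 1, n_persons)      # person abilities
beta = rng.normal(0, 1, n_items)         # item difficulties

# Rasch responses; the true score is the expected number-correct.
p = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
X = (rng.random((n_persons, n_items)) < p).astype(int)
true_score = p.sum(axis=1)
obs = X.sum(axis=1)

# Reliability estimated by Cronbach's alpha.
alpha = n_items / (n_items - 1) * (1 - X.var(axis=0, ddof=1).sum()
                                   / obs.var(ddof=1))

# Kelley's formula: shrink the observed score toward the group mean.
kelley = alpha * obs + (1 - alpha) * obs.mean()
print(np.corrcoef(kelley, true_score)[0, 1])
```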
This paper presents an analysis, based on simulation, of the stability of principal components. Stability is measured by the expectation of the absolute inner product of the sample principal component with the corresponding population component. A multiple regression model to predict stability is devised, calibrated, and tested using simulated Normal data. Results show that the model can provide useful predictions of individual principal component stability when working with correlation matrices. Further, the predictive validity of the model is tested against data simulated from three non-Normal distributions. The model predicted very well even when the data departed from normality, thus giving robustness to the proposed measure. Used in conjunction with other existing rules this measure will help the user in determining interpretability of principal components.
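The stability measure itself is simple to estimate by simulation, as in the following sketch; the population correlation matrix, sample size, and replication count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative population correlation matrix and its leading component.
R = np.array([[1.0, 0.6, 0.5],
              [0.6, 1.0, 0.4],
              [0.5, 0.4, 1.0]])
pop_pc = np.linalg.eigh(R)[1][:, -1]     # eigenvector of the largest eigenvalue

# Stability: E|<sample PC, population PC>|, estimated over repeated samples.
n, reps, acc = 50, 2000, 0.0
for _ in range(reps):
    X = rng.multivariate_normal(np.zeros(3), R, size=n)
    sample_pc = np.linalg.eigh(np.corrcoef(X, rowvar=False))[1][:, -1]
    acc += abs(sample_pc @ pop_pc)
print(acc / reps)
```

The absolute value makes the measure invariant to the arbitrary sign of an eigenvector, which is why it is a natural choice here.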
The jackknife by groups and modifications of the jackknife by groups are used to estimate standard errors of rotated factor loadings for selected populations in common factor model maximum likelihood factor analysis. Simulations are performed in which t-statistics based upon these jackknife estimates of the standard errors are computed. The validity of the t-statistics and their associated confidence intervals is assessed. Methods are given through which the computational efficiency of the jackknife may be greatly enhanced in the factor analysis model.
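The following sketch shows the jackknife-by-groups variance computation for a generic statistic (here a correlation); the paper's full pipeline of maximum likelihood factor analysis plus rotation is replaced by this simpler statistic for brevity.

```python
import numpy as np

def grouped_jackknife_se(data, stat, n_groups):
    """Jackknife-by-groups standard error of stat(data), via pseudovalues;
    rows are split into contiguous groups for illustration."""
    n = len(data)
    groups = np.array_split(np.arange(n), n_groups)
    full = stat(data)
    pseudo = []
    for g in groups:
        mask = np.ones(n, dtype=bool)
        mask[g] = False                  # leave one group out
        pseudo.append(n_groups * full - (n_groups - 1) * stat(data[mask]))
    return np.sqrt(np.var(pseudo, ddof=1) / n_groups)

rng = np.random.default_rng(3)
x = rng.normal(size=(300, 2))
corr = lambda d: np.corrcoef(d, rowvar=False)[0, 1]
print(grouped_jackknife_se(x, corr, n_groups=10))
```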
A standard approach for handling ordinal data in covariance analysis such as structural equation modeling is to assume that the data were produced by discretizing a multivariate normal vector. Recently, concern has been raised that this approach may be less robust to violation of the normality assumption than previously reported. We propose a new perspective for studying the robustness toward distributional misspecification in ordinal models using a class of non-normal ordinal covariance models. We show how to simulate data from such models, and our simulation results indicate that standard methodology is sensitive to violation of normality. This emphasizes the importance of testing distributional assumptions in empirical studies. We include simulation results on the performance of such tests.
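The standard generating model referred to here, discretizing a multivariate normal vector at fixed thresholds, is easy to state in code; the correlation and threshold values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])
Z = rng.multivariate_normal(np.zeros(2), R, size=10_000)

# Discretize each latent normal at fixed thresholds, yielding two ordinal
# variables with four categories each.
thresholds = [-1.0, 0.0, 1.0]
ordinal = np.digitize(Z, thresholds)     # category codes in {0, 1, 2, 3}
```

Simulating from the non-normal ordinal covariance models the paper proposes would amount to replacing Z above with draws whose dependence structure is not Gaussian.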
An approach to generating non-normality in multivariate data based on a structural model with normally distributed latent variables is presented. The key idea is to create non-normality in the manifest variables by applying non-linear linking functions to the latent part, the error part, or both. The algorithm corrects the covariance matrix for the applied function by approximating the deviance using an approximated normal variable. We show that the root mean square error (RMSE) for the covariance matrix converges to zero as sample size increases and closely approximates the RMSE obtained when generating normally distributed variables. Our algorithm creates non-normality affecting every moment, is computationally undemanding, easy to apply, and particularly useful for simulation studies in structural equation modeling.
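A stripped-down sketch of the key idea, applying a non-linear link to the latent part of a one-factor model; the link function, loadings, and sample size are illustrative, and the paper's covariance-correction step is omitted.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(5)
n = 10_000
loadings = np.array([0.8, 0.7, 0.6])

# One-factor model with a non-linear link applied to the latent part only.
f = rng.normal(size=n)
e = rng.normal(size=(n, 3)) * np.sqrt(1 - loadings**2)
g = lambda t: t + 0.3 * t**2             # illustrative non-linear link
y = g(f)[:, None] * loadings + e         # manifest variables, now non-normal

print(skew(y, axis=0))                   # non-zero marginal skewness
```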
This paper considers a multivariate normal model with one of the component variables observable only in polytomous form. The maximum likelihood approach is used for estimation of the parameters in the model. The Newton-Raphson algorithm is implemented to obtain the solution of the problem. Examples based on real and simulated data are reported.
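As a stripped-down univariate piece of this problem, the sketch below obtains ML estimates of the discretization thresholds from category counts. For brevity it uses a generic quasi-Newton optimizer rather than the hand-coded Newton-Raphson iteration the paper implements, and the counts are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

counts = np.array([120, 340, 380, 160])   # illustrative category counts

def negloglik(t):
    # Multinomial log-likelihood of a standard normal variable observed
    # only as the category between successive ordered thresholds.
    cuts = np.concatenate(([-np.inf], np.sort(t), [np.inf]))
    probs = np.diff(norm.cdf(cuts))
    return -np.sum(counts * np.log(probs))

res = minimize(negloglik, x0=[-1.0, 0.0, 1.0])
print(res.x)                               # ML threshold estimates
```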
The Non-Equivalent groups with Anchor Test (NEAT) design involves missing data that are missing by design. Three nonlinear observed score equating methods used with a NEAT design are frequency estimation equipercentile equating (FEEE), chain equipercentile equating (CEE), and item response theory observed score equating (IRT OSE). These three methods each make different assumptions about the missing data in the NEAT design. The FEEE method assumes that the conditional distribution of the test score given the anchor test score is the same in the two examinee groups. The CEE method assumes that the equipercentile functions equating the test score to the anchor test score are the same in the two examinee groups. The IRT OSE method assumes that the IRT model employed fits the data adequately, and that the items in the tests and the anchor test do not exhibit differential item functioning across the two examinee groups. This paper first describes the missing data assumptions of the three equating methods. It then describes how the missing data in the NEAT design can be filled in a manner that is coherent with the assumptions made by each of these equating methods. Implications for equating are also discussed.
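Of the three methods, CEE is the simplest to sketch. The toy example below chains two equipercentile functions through the anchor, estimating the first in group P and the second in group Q. Real NEAT data are discrete number-correct scores and would require presmoothing and continuization, all omitted here; the data are simulated for illustration.

```python
import numpy as np

def equipercentile(score, from_scores, to_scores):
    """Map a score through the equipercentile function linking two observed
    score distributions (no presmoothing or continuization)."""
    p = np.mean(from_scores <= score)
    return np.quantile(to_scores, p)

rng = np.random.default_rng(6)
# Group P takes test X and anchor A; group Q takes test Y and anchor A.
xP, aP = rng.normal(50, 10, 1000), rng.normal(20, 5, 1000)
yQ, aQ = rng.normal(55, 9, 1000), rng.normal(22, 5, 1000)

x = 60.0
a = equipercentile(x, xP, aP)   # test X -> anchor, estimated in group P
y = equipercentile(a, aQ, yQ)   # anchor -> test Y, estimated in group Q
print(y)
```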