Results for 'data models'

987 found
  1. Data models and the acquisition and manipulation of data. Todd Harris - 2003 - Philosophy of Science 70 (5):1508-1517.
    This paper offers an account of data manipulation in scientific experiments. It will be shown that in many cases raw, unprocessed data is not produced, but rather a form of processed data that will be referred to as a data model. The language of data models will be used to provide a framework within which to understand a recent debate about the status of data and data manipulation. It will be seen that (...)
    29 citations
  2. The role data model revisited. Friedrich Steimann - 2007 - Applied Ontology 2 (2):89-103.
    While Bachman's role data model is often cited, it appears that its contribution, the introduction of role types and their conception as unions of entity types that occupy the places of relationship types, has mostly been ignored. This is unfortunate since it has led to countless reinventions of the wheel, and sometimes even to regress. With this homage, the author wishes to shed some light on the natural elegance of Bachman's role concept, and to make clear why he believes (...)
    4 citations
  3. Data models, representation and adequacy-for-purpose. Alisa Bokulich & Wendy Parker - 2021 - European Journal for Philosophy of Science 11 (1):1-26.
    We critically engage two traditional views of scientific data and outline a novel philosophical view that we call the pragmatic-representational view of data. On the PR view, data are representations that are the product of a process of inquiry, and they should be evaluated in terms of their adequacy or fitness for particular purposes. Some important implications of the PR view for data assessment, related to misrepresentation, context-sensitivity, and complementary use, are highlighted. The PR view provides (...)
    19 citations
  4. What is a data model? An anatomy of data analysis in high energy physics. Antonis Antoniou - 2021 - European Journal for Philosophy of Science 11 (4):1-33.
    Many decades ago Patrick Suppes argued rather convincingly that theoretical hypotheses are not confronted with the direct, raw results of an experiment; rather, they are typically compared with models of data. What exactly is a data model, however? And how do the interactions of particles at the subatomic scale give rise to the huge volumes of data that are then moulded into a polished data model? The aim of this paper is to answer these questions (...)
    4 citations
  5. Data, Models and Earth History in Deep Convolution: Paleoclimate Simulations and their Epistemological Unrest. Christoph Rosol - 2017 - Berichte zur Wissenschaftsgeschichte 40 (2):120-139.
    Climate and Earth system models are used not only to project future climatic conditions but also to reconstruct past climate events. This contribution is the second in a series that presents paleoclimatology, the science of climates before the advent of direct, instrument-based measurement, as an epistemically radical practice, one that directly and openly dissolves the distinction between data and model and reconfigures the concept of the experiment (...)
    1 citation
  6. Rough Set Approach toward Data Modelling and User Knowledge for Extracting Insights. Xiaoqun Liao, Shah Nazir, Junxin Shen, Bingliang Shen & Sulaiman Khan - 2021 - Complexity 2021:1-9.
    Information is considered to be a major asset of an organization. With the enhancement of technology, the level of knowledge is increasing with the passage of time, in volume, velocity, and variety. Extracting meaningful insights from such information and knowledge is a pressing individual need. Visualization is a key tool and has become one of the most significant platforms for interpreting, extracting, and communicating information. The current study is an endeavour toward data modelling (...)
    1 citation
  7. User Knowledge, Data Modelling, and Visualization: Handling through the Fuzzy Logic-Based Approach. Xiaoqun Liao, Shah Nazir, Yangbin Zhou, Muhammad Shafiq & Xuelin Qi - 2021 - Complexity 2021:1-14.
    In modern-day technology, the level of knowledge is increasing day by day, in terms of volume, velocity, and variety. Understanding such knowledge is essential for extracting meaningful insight from it. With advances in computer and image-based technologies, visualization has become one of the most significant platforms to extract, interpret, and communicate information. In data modelling, visualization is the process of extracting knowledge to reveal the detailed data structure and (...)
  8. Evidence, Explanation and Predictive Data Modelling. Steve T. McKinlay - 2017 - Philosophy and Technology 30 (4):461-473.
    Predictive risk modelling is a computational method used to generate probabilities correlating events. The output of such systems is typically represented by a statistical score derived from various related and often arbitrary datasets. In many cases, the information generated by such systems is treated as a form of evidence to justify further action. This paper examines the nature of the information generated by such systems and compares it with more orthodox notions of evidence found in epistemology. The paper focuses on (...)
    3 citations
  9. Tractable inference for probabilistic data models. Lehel Csato, Manfred Opper & Ole Winther - 2003 - Complexity 8 (4):64-68.
  10. From Data to Causes III: Bayesian Priors for General Cross-Lagged Panel Models (GCLM). Michael J. Zyphur, Ellen L. Hamaker, Louis Tay, Manuel Voelkle, Kristopher J. Preacher, Zhen Zhang, Paul D. Allison, Dean C. Pierides, Peter Koval & Edward F. Diener - 2021 - Frontiers in Psychology 12:612251.
    This article describes some potential uses of Bayesian estimation for time-series and panel data models by incorporating information from prior probabilities (i.e., priors) in addition to observed data. Drawing on econometrics and other literatures we illustrate the use of informative “shrinkage” or “small variance” priors (including so-called “Minnesota priors”) while extending prior work on the general cross-lagged panel model (GCLM). Using a panel dataset of national income and subjective well-being (SWB) we describe three key benefits of these (...)
  11. Using models to correct data: paleodiversity and the fossil record. Alisa Bokulich - 2018 - Synthese 198 (Suppl 24):5919-5940.
    Despite an enormous philosophical literature on models in science, surprisingly little has been written about data models and how they are constructed. In this paper, I examine the case of how paleodiversity data models are constructed from the fossil data. In particular, I show how paleontologists are using various model-based techniques to correct the data. Drawing on this research, I argue for the following related theses: first, the ‘purity’ of a data model (...)
    29 citations
  12. Model selection and the multiplicity of patterns in empirical data. James W. McAllister - 2007 - Philosophy of Science 74 (5):884-894.
    Several quantitative techniques for choosing among data models are available. Among these are techniques based on algorithmic information theory, minimum description length theory, and the Akaike information criterion. All these techniques are designed to identify a single model of a data set as being the closest to the truth. I argue, using examples, that many data sets in science show multiple patterns, providing evidence for multiple phenomena. For any such data set, there is more than (...)
    10 citations
  13. Modeling Cultural Idea Systems: The Relationship between Theory Models and Data Models. Dwight Read - 2013 - Perspectives on Science 21 (2):157-174.
    Subjective experience is transformed into objective reality for societal members through cultural idea systems that can be represented with theory and data models. A theory model shows relationships and their logical implications that structure a cultural idea system. A data model expresses patterning found in ethnographic observations regarding the behavioral implementation of cultural idea systems. An example of this duality for modeling cultural idea systems is illustrated with Arabic proverbs that structurally link friend and enemy as concepts (...)
    1 citation
  14. Modelling and Guaranteeing Quality of Service over Data Streams (Workshop on Web-Based Massive Data Processing, Session 1: Streaming Data). Shanshan Gu Wu & Yanfei Yu Lv - 2006 - In O. Stock & M. Schaerf (eds.), Lecture Notes in Computer Science. Springer Verlag. pp. 13-24.
  15. Data and phenomena in conceptual modelling. Benedikt Löwe & Thomas Müller - 2011 - Synthese 182 (1):131-148.
    The distinction between data and phenomena introduced by Bogen and Woodward (Philosophical Review 97(3):303–352, 1988) was meant to help account for scientific practice, especially in relation to scientific theory testing. Their article and the subsequent discussion are primarily viewed as internal to philosophy of science. We shall argue that the data/phenomena distinction can be used much more broadly in modelling processes in philosophy.
    4 citations
  16. Emerging models of data governance in the age of datafication. Anna Berti Suman, Max Craglia, Marisa Ponti & Marina Micheli - 2020 - Big Data and Society 7 (2).
    The article examines four models of data governance emerging in the current platform society. While major attention is currently given to the dominant model of corporate platforms collecting and economically exploiting massive amounts of personal data, other actors, such as small businesses, public bodies and civic society, also take part in data governance. The article sheds light on four models emerging from the practices of these actors: data sharing pools, data cooperatives, public (...) trusts and personal data sovereignty. We propose a social science-informed conceptualisation of data governance. Drawing from the notion of data infrastructure we identify the models as a function of the stakeholders' roles, their interrelationships, articulations of value, and governance principles. Addressing the politics of data, we consider the actors' competitive struggles for governing data. This conceptualisation brings to the forefront the power relations and multifaceted economic and social interactions within data governance models emerging in an environment mainly dominated by corporate actors. These models highlight that civic society and public bodies are key actors for democratising data governance and redistributing value produced through data. Through the discussion of the models, their underpinning principles and limitations, the article aims to inform future investigations of socio-technical imaginaries for the governance of data, particularly now that the policy debate around data governance is very active in Europe.
    4 citations
  17. Graphs in Linguistics: Diagrammatic Features and Data Models. Paolo Petricca - 2019 - In Matthieu Fontaine, Cristina Barés-Gómez, Francisco Salguero-Lamillar, Lorenzo Magnani & Ángel Nepomuceno-Fernández (eds.), Model-Based Reasoning in Science and Technology: Inferential Models for Logic, Language, Cognition and Computation. Springer Verlag.
  18. Data-Driven Model-Free Adaptive Control of Particle Quality in Drug Development Phase of Spray Fluidized-Bed Granulation Process. Zhengsong Wang, Dakuo He, Xu Zhu, Jiahuan Luo, Yu Liang & Xu Wang - 2017 - Complexity:1-17.
    A novel data-driven model-free adaptive control (DDMFAC) approach is first proposed by combining the advantages of model-free adaptive control (MFAC) and data-driven optimal iterative learning control (DDOILC); its stability and convergence analysis is then given to prove algorithm stability and asymptotic convergence of the tracking error. In addition, the parameters of the presented approach are adaptively adjusted with fuzzy logic to determine the occupied proportions of MFAC and DDOILC according to their different control performances in different control stages. Lastly, the proposed fuzzy DDMFAC (...)
  19. Modeling the Cardiovascular-Respiratory Control System: Data, Model Analysis, and Parameter Estimation. Jerry J. Batzel & Mostafa Bachar - 2010 - Acta Biotheoretica 58 (4):369-380.
    Several key areas in modeling the cardiovascular and respiratory control systems are reviewed and examples are given which reflect the research state of the art in these areas. Attention is given to the interrelated issues of data collection, experimental design, and model application including model development and analysis. Examples are given of current clinical problems which can be examined via modeling, and important issues related to model adaptation to the clinical setting.
  20. A Novel Efficiency Measure Model for Industrial Land Use Based on Subvector Data Envelope Analysis and Spatial Analysis Method. Wei Chen, Rui He & Qun Wu - 2017 - Complexity:1-11.
    With the rapid and unbalanced development of industry, a large amount of cultivated land is converted into industrial land with lower efficiency. The existing research is extensively concerned with industrial land use and industrial development in isolation, but little attention has been paid to the relationship between them. To help address this gap, the paper creates a new efficiency measure method for industrial land use combining Subvector Data Envelope Analysis with a spatial analysis approach. The proposed model has been verified (...)
    4 citations
  21. Values and inductive risk in machine learning modelling: the case of binary classification models. Koray Karaca - 2021 - European Journal for Philosophy of Science 11 (4):1-27.
    I examine the construction and evaluation of machine learning binary classification models. These models are increasingly used for societal applications such as classifying patients into two categories according to the presence or absence of a certain disease like cancer and heart disease. I argue that the construction of ML classification models involves an optimisation process aiming at the minimization of the inductive risk associated with the intended uses of these models. I also argue that the construction (...)
    4 citations
  22. Big Data is not only about data: The two cultures of modelling. Giuseppe Alessandro Veltri - 2017 - Big Data and Society 4 (1).
    The contribution of Big Data to social science is not limited to data availability but includes the introduction of analytical approaches that have been developed in computer science, and in particular in machine learning. This brings about a new 'culture' of statistical modelling that bears considerable potential for the social scientist. This argument is illustrated with a brief discussion of model-based recursive partitioning, which can bridge theory-driven and data-driven approaches. Such a method is an example of (...)
  23. Models of data. Patrick Suppes - 2009 - In Ernest Nagel, Patrick Suppes & Alfred Tarski (eds.), Provability, Computability and Reflection. Stanford, CA, USA: Elsevier.
  24. Understanding climate phenomena with data-driven models. Benedikt Knüsel & Christoph Baumberger - 2020 - Studies in History and Philosophy of Science Part A 84 (C):46-56.
    In climate science, climate models are one of the main tools for understanding phenomena. Here, we develop a framework to assess the fitness of a climate model for providing understanding. The framework is based on three dimensions: representational accuracy, representational depth, and graspability. We show that this framework does justice to the intuition that classical process-based climate models give understanding of phenomena. While simple climate models are characterized by a larger graspability, state-of-the-art models have a higher (...)
    7 citations
  25. Model-Based Demography: Essays on Integrating Data, Technique and Theory. Thomas K. Burch - 2017 - Springer Verlag.
    Late in a career of more than sixty years, Thomas Burch, an internationally known social demographer, undertook a wide-ranging methodological critique of demography. This open access volume contains a selection of resulting papers, some previously unpublished, some published but not readily accessible [from past meetings of The International Union for the Scientific Study of Population and its research committees, or from other small conferences and seminars]. Rejecting the idea that demography is simply a branch of applied statistics, his work views (...)
  26. Satellite Data and Climate Models. Elisabeth A. Lloyd - 2018 - In Elisabeth A. Lloyd & Eric Winsberg (eds.), Climate Modelling: Philosophical and Conceptual Issues. Springer Verlag. pp. 65-71.
    In this brief chapter, Lloyd sets the stage for the following three papers, most centrally, Santer et al., which discusses whether the satellite data fit with climate models. Its target is a paper by Douglass et al., which claimed that satellite and weather balloon data showed that the climate models were wrong and could not be trusted. The Santer and Wigley “Fact Sheet” gives a nontechnical summary of what is wrong with the Douglass paper, while the (...)
  27. Data of Covid-19 Infection in Italy and Mathematical Models. Luigi Togliani - 2020 - Science and Philosophy 8 (2):165-180.
    In this paper I consider some data of Covid-19 infection in Italy from the 20th of February to the 29th of June 2020. Data are analyzed using some fits based on mathematical models. This analysis may be proposed to students of the last class of the Liceo Scientifico in order to debate a real problem with mathematical tools.
  28. Local Model-Data Symbiosis in Meteorology and Climate Science. Wendy Parker - 2020 - Philosophy of Science 87 (5):807-818.
    I introduce a distinction between general and local model-data symbiosis and offer three examples of local symbiosis in the fields of meteorology and climate science. Local model-data symbiosis (...)
    2 citations
  29. When mechanistic models explain. Carl F. Craver - 2006 - Synthese 153 (3):355-376.
    Not all models are explanatory. Some models are data summaries. Some models sketch explanations but leave crucial details unspecified or hidden behind filler terms. Some models are used to conjecture a how-possibly explanation without regard to whether it is a how-actually explanation. I use the Hodgkin and Huxley model of the action potential to illustrate these ways that models can be useful without explaining. I then use the subsequent development of the explanation of the (...)
    259 citations
  30. Essential and mandatory part-whole relations in conceptual data models. C. Maria Keet - unknown.
    A recurring problem in conceptual modelling and ontology development is the representation of part-whole relations, with a requirement to be able to distinguish between essential and mandatory parts. To solve this problem, we formally characterize the semantics of these shareability notions by resorting to the temporal conceptual model ERVT and its formalization in the description logic DLRUS.
    1 citation
  31. Assessing the Strengths and Weaknesses of Large Language Models. Shalom Lappin - 2023 - Journal of Logic, Language and Information 33 (1):9-20.
    The transformers that drive chatbots and other AI systems constitute large language models (LLMs). These are currently the focus of a lively discussion in both the scientific literature and the popular media. This discussion ranges from hyperbolic claims that attribute general intelligence and sentience to LLMs, to the skeptical view that these devices are no more than “stochastic parrots”. I present an overview of some of the weak arguments that have been presented against LLMs, and I consider several of (...)
    8 citations
  32. The DASH model: Data for addressing social determinants of health in local health departments. Anna Petrovskis, Betty Bekemeier, Elizabeth Heitkemper & Jenna van Draanen - 2023 - Nursing Inquiry 30 (1):e12518.
    Recent frameworks, models, and reports highlight the critical need to address social determinants of health for achieving health equity in the United States and around the globe. In the United States, data play an important role in better understanding community‐level and population‐level disparities particularly for local health departments. However, data‐driven decision‐making—the use of data for public health activities such as program implementation, policy development, and resource allocation—is often presented theoretically or through case studies in the literature. (...)
    1 citation
  33. (1 other version) Social data governance: Towards a definition and model. Jun Liu - 2022 - Big Data and Society 9 (2).
    With the surge in the number of data and datafied governance initiatives, arrangements, and practices across the globe, understanding various types of such initiatives, arrangements, and their structural causes has become a daunting task for scholars, policy makers, and the public. This complexity additionally generates substantial difficulties in considering different data(fied) governances commensurable with each other. To advance the discussion, this study argues that existing scholarship is inclined to embrace an organization-centric perspective that primarily concerns factors and dynamics (...)
  34. Coherence and correspondence in the network dynamics of belief suites. Patrick Grim, Andrew Modell, Nicholas Breslin, Jasmine Mcnenny, Irina Mondescu, Kyle Finnegan, Robert Olsen, Chanyu An & Alexander Fedder - 2017 - Episteme 14 (2):233-253.
    Coherence and correspondence are classical contenders as theories of truth. In this paper we examine them instead as interacting factors in the dynamics of belief across epistemic networks. We construct an agent-based model of network contact in which agents are characterized not in terms of single beliefs but in terms of internal belief suites. Individuals update elements of their belief suites on input from other agents in order both to maximize internal belief coherence and to incorporate ‘trickled in’ elements of (...)
  35. Biomedical Big Data: New Models of Control Over Access, Use and Governance. Alessandro Blasimme & Effy Vayena - 2017 - Journal of Bioethical Inquiry 14 (4):501-513.
    Empirical evidence suggests that while people hold the capacity to control their data in high regard, they increasingly experience a loss of control over their data in the online world. The capacity to exert control over the generation and flow of personal information is a fundamental premise to important values such as autonomy, privacy, and trust. In healthcare and clinical research this capacity is generally achieved indirectly, by agreeing to specific conditions of informational exposure. Such conditions can be (...)
    15 citations
  36. Predicting Big Data Adoption in Companies With an Explanatory and Predictive Model. Ángel F. Villarejo-Ramos, Juan-Pedro Cabrera-Sánchez, Juan Lara-Rubio & Francisco Liébana-Cabanillas - 2021 - Frontiers in Psychology 12:651398.
    The purpose of this paper is to identify the factors that affect the intention to use Big Data Applications in companies. Research into Big Data usage intention and adoption is scarce, even more so from the perspective of the use of these techniques in companies. That is why this research focuses on analyzing the adoption of Big Data Applications by companies. Following a review of the literature, it is proposed to use a UTAUT model as a (...)
  37. Large Language Models and the Reverse Turing Test. Terrence Sejnowski - 2023 - Neural Computation 35 (3):309–342.
    Large Language Models (LLMs) have been transformative. They are pre-trained foundational models that are self-supervised and can be adapted with fine tuning to a wide range of natural language tasks, each of which previously would have required a separate network model. This is one step closer to the extraordinary versatility of human language. GPT-3 and more recently LaMDA can carry on dialogs with humans on many topics after minimal priming with a few examples. However, there has been a (...)
    3 citations
  38. Truth machines: synthesizing veracity in AI language models. Luke Munn, Liam Magee & Vanicka Arora - 2024 - AI and Society 39 (6):2759-2773.
    As AI technologies are rolled out into healthcare, academia, human resources, law, and a multitude of other domains, they become de-facto arbiters of truth. But truth is highly contested, with many different definitions and approaches. This article discusses the struggle for truth in AI systems and the general responses to date. It then investigates the production of truth in InstructGPT, a large language model, highlighting how data harvesting, model architectures, and social feedback mechanisms weave together disparate understandings of veracity. (...)
    3 citations
  39. Models and Data in Finance: les Liaisons Dangereuses. Emiliano Ippoliti - 2019 - In Matthieu Fontaine, Cristina Barés-Gómez, Francisco Salguero-Lamillar, Lorenzo Magnani & Ángel Nepomuceno-Fernández (eds.), Model-Based Reasoning in Science and Technology: Inferential Models for Logic, Language, Cognition and Computation. Springer Verlag.
    3 citations
  40. Connectionist and Memory-Array Models of Artificial Grammar Learning. Zoltan Dienes - 1992 - Cognitive Science 16 (1):41-79.
    Subjects exposed to strings of letters generated by a finite state grammar can later classify grammatical and nongrammatical test strings, even though they cannot adequately say what the rules of the grammar are (e.g., Reber, 1989). The MINERVA 2 (Hintzman, 1986) and Medin and Schaffer (1978) memory-array models and a number of connectionist autoassociator models are tested against experimental data by deriving mainly parameter-free predictions from the models of the rank order of classification difficulty of test (...)
    61 citations
  41. Lessons from the Large Hadron Collider for model-based experimentation: the concept of a model of data acquisition and the scope of the hierarchy of models. Koray Karaca - 2018 - Synthese 195 (12):1-22.
    According to the hierarchy of models account of scientific experimentation developed by Patrick Suppes and elaborated by Deborah Mayo, theoretical considerations about the phenomena of interest are involved in an experiment through theoretical models that in turn relate to experimental data through data models, via the linkage of experimental models. In this paper, I dispute the HoM account in the context of present-day high-energy physics experiments. I argue that even though the HoM account aims (...)
    14 citations
  42. Sources of Understanding in Supervised Machine Learning Models. Paulo Pirozelli - 2022 - Philosophy and Technology 35 (2):1-19.
    In the last decades, supervised machine learning has seen the widespread growth of highly complex, non-interpretable models, of which deep neural networks are the most typical representative. Due to their complexity, these models have shown an outstanding performance in a series of tasks, as in image recognition and machine translation. Recently, though, there has been an important discussion over whether those non-interpretable models are able to provide any sort of understanding whatsoever. For some scholars, only interpretable (...) can provide understanding. More popular, however, is the idea that understanding can come from a careful analysis of the dataset or from the model's theoretical basis. In this paper, I wish to examine the possible forms of obtaining understanding of such non-interpretable models. Two main strategies for providing understanding are analyzed. The first involves understanding without interpretability, either through external evidence for the model's inner functioning or through analyzing the data. The second is based on the artificial production of interpretable structures, through three main forms: post hoc models, hybrid models, and quasi-interpretable structures. Finally, I consider some of the conceptual difficulties in the attempt to create explanations for these models, and their implications for understanding.
    1 citation
  43. Data Synthesis for Big Questions: From Animal Tracks to Ecological Models. Rose Trappes - 2024 - Philosophy, Theory, and Practice in Biology 16 (1):4.
    This paper addresses a relatively new mode of ecological research: data synthesis studies. Data synthesis studies involve reusing data to create a general model as well as a reusable, aggregated dataset. Using a case from movement ecology, I analyse the trade-offs and strategies involved in data synthesis. I find that, like theoretical ecological modelling, synthesis studies involve a modelling trade-off between generality, precision and realism; they deal with this trade-off by adopting a pragmatic kludging strategy. I (...)
  44. Prelog’s model as the first tool to predict stereoselectivity: identifying patterns in chemical data to construct models.Toratane Munegumi - forthcoming - Foundations of Chemistry:1-19.
    Prelog’s model was one of the first empirical models to explain the stereoselectivity of the Grignard reactions of 2-oxocarboxylic acid esters bearing a chiral alcohol. Prelog constructed his model based on some assumptions regarding the conformation of chiral 2-oxocarboxylic acid esters to explain the relationship in configuration between the chiral alcohol starting materials and the 2-hydroxycarboxylic acid products. Construction of the model involves four steps: (1) mentally analyzing the reactants to identify the basic stereochemical structures, (2) assuming the conformations (...)
  45. Fashioning descriptive models in biology: Of Worms and wiring diagrams.Rachel A. Ankeny - 2000 - Philosophy of Science 67 (3):272.
    The biological sciences have become increasingly reliant on so-called 'model organisms'. I argue that in this domain, the concept of a descriptive model is essential for understanding scientific practice. Using a case study, I show how such a model was formulated in a preexplanatory context for subsequent use as a prototype from which explanations ultimately may be generated both within the immediate domain of the original model and in additional, related domains. To develop this concept of a descriptive model, I (...)
    Bookmark   45 citations  
  46. Multilevel Models for Intensive Longitudinal Data with Heterogeneous Autoregressive Errors: The Effect of Misspecification and Correction with Cholesky Transformation.Seungmin Jahng & Phillip K. Wood - 2017 - Frontiers in Psychology 8.
  47. Integration of Heterogeneous Biological Data in Multiscale Mechanistic Model Calibration: Application to Lung Adenocarcinoma.Claudio Monteiro, Adèle L’Hostis, Jim Bosley, Ben M. W. Illigens, Eliott Tixier, Matthieu Coudron, Emmanuel Peyronnet, Nicoletta Ceres, Angélique Perrillat-Mercerot & Jean-Louis Palgen - 2022 - Acta Biotheoretica 70 (3):1-24.
    Mechanistic models are built using knowledge as the primary information source, with well-established biological and physical laws determining the causal relationships within the model. Once the causal structure of the model is determined, parameters must be defined in order to accurately reproduce relevant data. Determining parameters and their values is particularly challenging in the case of models of pathophysiology, for which data for calibration is sparse. Multiple data sources might be required, and data may (...)
  48. Fast and frugal versus regression models of human judgement.Mandeep K. Dhami & Clare Harries - 2001 - Thinking and Reasoning 7 (1):5-27.
    Following Brunswik (1952), social judgement theorists have long relied on regression models to describe both an individual's judgements and the environment about which such judgements are made. However, social judgement theory is not synonymous with these compensatory, static, structural models. We compared the characterisation of physicians' judgements given by a regression model with that given by a non-compensatory process model (called fast and frugal). We found that both models fit the data equally well. Both models suggest that (...)
    Bookmark   4 citations  
  49. What distinguishes data from models?Sabina Leonelli - 2019 - European Journal for Philosophy of Science 9 (2):22.
    I propose a framework that explicates and distinguishes the epistemic roles of data and models within empirical inquiry through consideration of their use in scientific practice. After arguing that Suppes’ characterization of data models falls short in this respect, I discuss a case of data processing within exploratory research in plant phenotyping and use it to highlight the difference between practices aimed to make data usable as evidence and practices aimed to use data (...)
    Bookmark   19 citations  
  50. Stellar Structure Models Revisited: Evidence and Data in Asteroseismology.Mauricio Suárez - 2023 - In Nora Mills Boyd, Siska De Baerdemaeker, Kevin Heng & Vera Matarese (eds.), Philosophy of Astrophysics: Stars, Simulations, and the Struggle to Determine What is Out There. Springer Verlag.
    This paper advances further an ongoing project to understand the history of stellar structure modelling and its inferential practice. It does so by taking a harder look at the data: how it is collected, analysed statistically, and represented in HR diagrams and stellar structure models alike. The focus is ultimately on the sorts of strong observational constraints revealed in the last two decades within the new and expanding field of asteroseismology. It is argued that the typical inferential practices (...)
1 — 50 / 987