Results for 'data models'

975 found
  1. Data models and the acquisition and manipulation of data.Todd Harris - 2003 - Philosophy of Science 70 (5):1508-1517.
    This paper offers an account of data manipulation in scientific experiments. It will be shown that in many cases raw, unprocessed data is not produced, but rather a form of processed data that will be referred to as a data model. The language of data models will be used to provide a framework within which to understand a recent debate about the status of data and data manipulation. It will be seen that (...)
    29 citations
  2. Data models, representation and adequacy-for-purpose.Alisa Bokulich & Wendy Parker - 2021 - European Journal for Philosophy of Science 11 (1):1-26.
    We critically engage two traditional views of scientific data and outline a novel philosophical view that we call the pragmatic-representational view of data. On the PR view, data are representations that are the product of a process of inquiry, and they should be evaluated in terms of their adequacy or fitness for particular purposes. Some important implications of the PR view for data assessment, related to misrepresentation, context-sensitivity, and complementary use, are highlighted. The PR view provides (...)
    19 citations
  3. The role data model revisited.Friedrich Steimann - 2007 - Applied ontology 2 (2):89-103.
    While Bachman's role data model is often cited, it appears that its contribution, the introduction of role types and their conception as unions of entity types that occupy the places of relationship types, has mostly been ignored. This is unfortunate since it has led to countless reinventions of the wheel, and sometimes even to regress. With this homage, the author wishes to shed some light on the natural elegance of Bachman's role concept, and to make clear why he believes (...)
    4 citations
  4. Data, Models and Earth History in Deep Convolution: Paleoclimate Simulations and their Epistemological Unrest.Christoph Rosol - 2017 - Berichte Zur Wissenschaftsgeschichte 40 (2):120-139.
    Climate and earth system models are used not only to predict future climatic conditions, but also to reconstruct past climate events. This contribution is the second in a series that presents paleoclimatology, the science of climates before the advent of direct, instrument-based measurements, as an epistemically radical practice that directly and openly dissolves the distinction between data and model and reconfigures the concept of the experiment (...)
    1 citation
  5. Rough Set Approach toward Data Modelling and User Knowledge for Extracting Insights.Xiaoqun Liao, Shah Nazir, Junxin Shen, Bingliang Shen & Sulaiman Khan - 2021 - Complexity 2021:1-9.
    Information is considered to be the major part of an organization. With the enhancement of technology, the knowledge level is increasing with the passage of time. This increase of information is in volume, velocity, and variety. Extracting meaningful insights from such information and knowledge is a dire need. Visualization is a key tool and has become one of the most significant platforms for interpreting, extracting, and communicating information. The current study is an endeavour toward data modelling (...)
    1 citation
  6. What is a data model? An anatomy of data analysis in high energy physics.Antonis Antoniou - 2021 - European Journal for Philosophy of Science 11 (4):1-33.
    Many decades ago Patrick Suppes argued rather convincingly that theoretical hypotheses are not confronted with the direct, raw results of an experiment, rather, they are typically compared with models of data. What exactly is a data model however? And how do the interactions of particles at the subatomic scale give rise to the huge volumes of data that are then moulded into a polished data model? The aim of this paper is to answer these questions (...)
    4 citations
  7. Evidence, Explanation and Predictive Data Modelling.Steve T. Mckinlay - 2017 - Philosophy and Technology 30 (4):461-473.
    Predictive risk modelling is a computational method used to generate probabilities correlating events. The output of such systems is typically represented by a statistical score derived from various related and often arbitrary datasets. In many cases, the information generated by such systems is treated as a form of evidence to justify further action. This paper examines the nature of the information generated by such systems and compares it with more orthodox notions of evidence found in epistemology. The paper focuses on (...)
    3 citations
  8. User Knowledge, Data Modelling, and Visualization: Handling through the Fuzzy Logic-Based Approach.Xiaoqun Liao, Shah Nazir, Yangbin Zhou, Muhammad Shafiq & Xuelin Qi - 2021 - Complexity 2021:1-14.
    In modern-day technology, the level of knowledge is increasing day by day. This increase is in terms of volume, velocity, and variety. Understanding such knowledge and extracting meaningful insight from it is a dire need. With the advancement in computer and image-based technologies, visualization becomes one of the most significant platforms to extract, interpret, and communicate information. In data modelling, visualization is the process of extracting knowledge to reveal the detailed data structure and (...)
  9. Tractable inference for probabilistic data models.Lehel Csato, Manfred Opper & Ole Winther - 2003 - Complexity 8 (4):64-68.
  10. Modeling Cultural Idea Systems: The Relationship between Theory Models and Data Models.Dwight Read - 2013 - Perspectives on Science 21 (2):157-174.
    Subjective experience is transformed into objective reality for societal members through cultural idea systems that can be represented with theory and data models. A theory model shows relationships and their logical implications that structure a cultural idea system. A data model expresses patterning found in ethnographic observations regarding the behavioral implementation of cultural idea systems. An example of this duality for modeling cultural idea systems is illustrated with Arabic proverbs that structurally link friend and enemy as concepts (...)
    1 citation
  11. Modeling the Cardiovascular-Respiratory Control System: Data, Model Analysis, and Parameter Estimation.Jerry J. Batzel & Mostafa Bachar - 2010 - Acta Biotheoretica 58 (4):369-380.
    Several key areas in modeling the cardiovascular and respiratory control systems are reviewed and examples are given which reflect the research state of the art in these areas. Attention is given to the interrelated issues of data collection, experimental design, and model application including model development and analysis. Examples are given of current clinical problems which can be examined via modeling, and important issues related to model adaptation to the clinical setting.
  12. Essential and mandatory part-whole relations in conceptual data models.C. Maria Keet - unknown
    A recurring problem in conceptual modelling and ontology development is the representation of part-whole relations, with a requirement to be able to distinguish between essential and mandatory parts. To solve this problem, we formally characterize the semantics of these shareability notions by resorting to the temporal conceptual model ERVT and its formalization in the description logic DLRUS.
    1 citation
  13. Graphs in Linguistics: Diagrammatic Features and Data Models.Paolo Petricca - 2019 - In Matthieu Fontaine, Cristina Barés-Gómez, Francisco Salguero-Lamillar, Lorenzo Magnani & Ángel Nepomuceno-Fernández (eds.), Model-Based Reasoning in Science and Technology: Inferential Models for Logic, Language, Cognition and Computation. Springer Verlag.
  14. Using models to correct data: paleodiversity and the fossil record.Alisa Bokulich - 2018 - Synthese 198 (Suppl 24):5919-5940.
    Despite an enormous philosophical literature on models in science, surprisingly little has been written about data models and how they are constructed. In this paper, I examine the case of how paleodiversity data models are constructed from the fossil data. In particular, I show how paleontologists are using various model-based techniques to correct the data. Drawing on this research, I argue for the following related theses: first, the ‘purity’ of a data model (...)
    29 citations
  15. Data without models merging with models without data.Ulrich Krohs & Werner Callebaut - 2007 - In Fred C. Boogerd, Frank J. Bruggeman, Jan-Hendrik S. Hofmeyr & Hans V. Westerhoff (eds.), Systems Biology: Philosophical Foundations. Boston: Elsevier. pp. 181--213.
    Systems biology is largely tributary to genomics and other “omic” disciplines that generate vast amounts of structural data. “Omics”, however, lack a theoretical framework that would allow using these data sets as such (rather than just tiny bits that are extracted by advanced data-mining techniques) to build explanatory models that help understand physiological processes. Systems biology provides such a framework by adding a dynamic dimension to merely structural “omics”. It makes use of bottom-up and top-down (...). The former are based on data about systems components, the latter on systems-level data. We trace back both modeling strategies (which are often used to delineate two branches of the field) to the modeling of metabolic and signaling pathways in the bottom-up case, and to biological cybernetics and systems theory in the top-down case. We then argue that three roots of systems biology must be discerned to account adequately for the structure of the field: pathway modeling, biological cybernetics, and “omics”. We regard systems biology as merging modeling strategies (supplemented by new mathematical procedures) from data-poor fields with data supply from a field that is quite deficient in explanatory modeling. After characterizing the structure of the field, we address some epistemological and ontological issues regarding concepts on which the top-down approach relies and that seem to us to require clarification. This includes the consequences of identifying modules in large networks without relying on functional considerations, the question of the “holism” of systems biology, and the epistemic value of the “systeome” project that aspires to become the cutting edge of the field.
    57 citations
  16. Data and phenomena in conceptual modelling.Benedikt Löwe & Thomas Müller - 2011 - Synthese 182 (1):131-148.
    The distinction between data and phenomena introduced by Bogen and Woodward (Philosophical Review 97(3):303–352, 1988) was meant to help account for scientific practice, especially in relation to scientific theory testing. Their article and the subsequent discussion are primarily viewed as internal to philosophy of science. We shall argue that the data/phenomena distinction can be used much more broadly in modelling processes in philosophy.
    4 citations
  17. Reference Models: Using Models to Turn Data into Evidence.Teru Miyake - 2015 - Philosophy of Science 82 (5):822-832.
    Reference models of the earth’s interior play an important role in the acquisition of knowledge about the earth’s interior and the earth as a whole. Such models are used as a sort of standard reference against which data are compared. I argue that the use of reference models merits more attention than it has gotten so far in the literature on models, for it is an example of a method of doing science that has a (...)
    3 citations
  18. Big Data is not only about data: The two cultures of modelling.Giuseppe Alessandro Veltri - 2017 - Big Data and Society 4 (1).
    The contribution of Big Data to social science is not limited to data availability but includes the introduction of analytical approaches that have been developed in computer science, and in particular in machine learning. This brings about a new ‘culture’ of statistical modelling that bears considerable potential for the social scientist. This argument is illustrated with a brief discussion of model-based recursive partitioning which can bridge the theory and data-driven approach. Such a method is an example of (...)
  19. Model selection and the multiplicity of patterns in empirical data.James W. McAllister - 2007 - Philosophy of Science 74 (5):884-894.
    Several quantitative techniques for choosing among data models are available. Among these are techniques based on algorithmic information theory, minimum description length theory, and the Akaike information criterion. All these techniques are designed to identify a single model of a data set as being the closest to the truth. I argue, using examples, that many data sets in science show multiple patterns, providing evidence for multiple phenomena. For any such data set, there is more than (...)
    10 citations
  20. Modelling and Guaranteeing Quality of Service over Data Streams (Workshop on Web-Based Massive Data Processing, Session 1: Streaming Data).Shanshan Gu Wu & Yanfei Yu Lv - 2006 - In O. Stock & M. Schaerf (eds.), Lecture Notes In Computer Science. Springer Verlag. pp. 13-24.
  21. Neural Models for Imputation of Missing Ozone Data in Air-Quality Datasets.Ángel Arroyo, Álvaro Herrero, Verónica Tricio, Emilio Corchado & Michał Woźniak - 2018 - Complexity 2018:1-14.
    Ozone is one of the pollutants with most negative effects on human health and in general on the biosphere. Many data-acquisition networks collect data about ozone values in both urban and background areas. Usually, these data are incomplete or corrupt and the imputation of the missing values is a priority in order to obtain complete datasets, solving the uncertainty and vagueness of existing problems to manage complexity. In the present paper, multiple-regression techniques and Artificial Neural Network (...) are applied to approximate the absent ozone values from five explanatory variables containing air-quality information. To compare the different imputation methods, real-life data from six data-acquisition stations from the region of Castilla y León are gathered in different ways and then analyzed. The results obtained in the estimation of the missing values by applying these techniques and models are compared, analyzing the possible causes of the given response.
  22. Models of data.Patrick Suppes - 2009 - In Ernest Nagel, Patrick Suppes & Alfred Tarski (eds.), Provability, Computability and Reflection. Stanford, CA, USA: Elsevier.
  23. Modelling perceptions of criminality and remorse from faces using a data-driven computational approach.Friederike Funk, Mirella Walker & Alexander Todorov - 2017 - Cognition and Emotion 31 (7):1431-1443.
    Perceptions of criminality and remorse are critical for legal decision-making. While faces perceived as criminal are more likely to be selected in police lineups and to receive guilty verdicts, faces perceived as remorseful are more likely to receive less severe punishment recommendations. To identify the information that makes a face appear criminal and/or remorseful, we successfully used two different data-driven computational approaches that led to convergent findings: one relying on the use of computer-generated faces, and the other on photographs (...)
    1 citation
  24. From Data to Causes III: Bayesian Priors for General Cross-Lagged Panel Models (GCLM).Michael J. Zyphur, Ellen L. Hamaker, Louis Tay, Manuel Voelkle, Kristopher J. Preacher, Zhen Zhang, Paul D. Allison, Dean C. Pierides, Peter Koval & Edward F. Diener - 2021 - Frontiers in Psychology 12:612251.
    This article describes some potential uses of Bayesian estimation for time-series and panel data models by incorporating information from prior probabilities (i.e., priors) in addition to observed data. Drawing on econometrics and other literatures we illustrate the use of informative “shrinkage” or “small variance” priors (including so-called “Minnesota priors”) while extending prior work on the general cross-lagged panel model (GCLM). Using a panel dataset of national income and subjective well-being (SWB) we describe three key benefits of these (...)
  25. A Model-Theoretic Approach for Recovering Consistent Data from Inconsistent Knowledge-Bases.Arnon Avron - unknown
    One of the most significant drawbacks of classical logic is its being useless in the presence of an inconsistency. Nevertheless, the classical calculus is a very convenient framework to work with. In this work we propose means for drawing conclusions from systems that are based on classical logic, although the information might be inconsistent. The idea is to detect those parts of the knowledge-base that "cause" the inconsistency, and isolate the parts that are "recoverable". We do this by temporarily switching into (...)
     
  26. Understanding climate phenomena with data-driven models.Benedikt Knüsel & Christoph Baumberger - 2020 - Studies in History and Philosophy of Science Part A 84 (C):46-56.
    In climate science, climate models are one of the main tools for understanding phenomena. Here, we develop a framework to assess the fitness of a climate model for providing understanding. The framework is based on three dimensions: representational accuracy, representational depth, and graspability. We show that this framework does justice to the intuition that classical process-based climate models give understanding of phenomena. While simple climate models are characterized by a larger graspability, state-of-the-art models have a higher (...)
    7 citations
  27. A Novel Efficiency Measure Model for Industrial Land Use Based on Subvector Data Envelope Analysis and Spatial Analysis Method.Wei Chen, Rui He & Qun Wu - 2017 - Complexity:1-11.
    With the rapid and unbalanced development of industry, a large amount of cultivated land is converted into industrial land with lower efficiency. The existing research is extensively concerned with industrial land use and industrial development in isolation, but little attention has been paid to the relationship between them. To help address this gap, the paper creates a new efficiency measure method for industrial land use combining Subvector Data Envelope Analysis with spatial analysis approach. The proposed model has been verified (...)
    4 citations
  28. Bayesian models of cognition revisited: Setting optimality aside and letting data drive psychological theory.Sean Tauber, Daniel J. Navarro, Amy Perfors & Mark Steyvers - 2017 - Psychological Review 124 (4):410-441.
    16 citations
  29. Satellite Data and Climate Models.Elisabeth A. Lloyd - 2018 - In Elisabeth A. Lloyd & Eric Winsberg (eds.), Climate Modelling: Philosophical and Conceptual Issues. Springer Verlag. pp. 65-71.
    In this brief chapter, Lloyd sets the stage for the following three papers, most centrally, Santer et al., which discusses whether the satellite data fit with climate models. Its target is a paper by Douglass et al., which claimed that satellite and weather balloon data showed that the climate models were wrong and could not be trusted. The Santer and Wigley “Fact Sheet” gives a nontechnical summary of what is wrong with the Douglass paper, while the (...)
  30. Emerging models of data governance in the age of datafication.Anna Berti Suman, Max Craglia, Marisa Ponti & Marina Micheli - 2020 - Big Data and Society 7 (2).
    The article examines four models of data governance emerging in the current platform society. While major attention is currently given to the dominant model of corporate platforms collecting and economically exploiting massive amounts of personal data, other actors, such as small businesses, public bodies and civic society, also take part in data governance. The article sheds light on four models emerging from the practices of these actors: data sharing pools, data cooperatives, public (...) trusts and personal data sovereignty. We propose a social science-informed conceptualisation of data governance. Drawing from the notion of data infrastructure we identify the models as a function of the stakeholders’ roles, their interrelationships, articulations of value, and governance principles. Addressing the politics of data, we considered the actors’ competitive struggles for governing data. This conceptualisation brings to the forefront the power relations and multifaceted economic and social interactions within data governance models emerging in an environment mainly dominated by corporate actors. These models highlight that civic society and public bodies are key actors for democratising data governance and redistributing value produced through data. Through the discussion of the models, their underpinning principles and limitations, the article wishes to inform future investigations of socio-technical imaginaries for the governance of data, particularly now that the policy debate around data governance is very active in Europe.
    4 citations
  31. Business Data Ethics: Emerging Models for Governing AI and Advanced Analytics.Dennis Hirsch, Timothy Bartley, Aravind Chandrasekaran, Davon Norris, Srinivasan Parthasarathy & Piers Norris Turner - 2023 - Springer.
    This open access book explains how leading business organizations attempt to achieve the responsible and ethical use of artificial intelligence (AI) and other advanced information technologies. These technologies can produce tremendous insights and benefits. But they can also invade privacy, perpetuate bias, and otherwise injure people and society. To use these technologies successfully, organizations need to implement them responsibly and ethically. The question is: how to do this? Data ethics management, and this book, provide some answers. The authors (...)
  32. Data-Driven Model-Free Adaptive Control of Particle Quality in Drug Development Phase of Spray Fluidized-Bed Granulation Process.Zhengsong Wang, Dakuo He, Xu Zhu, Jiahuan Luo, Yu Liang & Xu Wang - 2017 - Complexity:1-17.
    A novel data-driven model-free adaptive control approach is first proposed by combining the advantages of model-free adaptive control and data-driven optimal iterative learning control, and then its stability and convergence analysis is given to prove algorithm stability and asymptotical convergence of tracking error. Besides, the parameters of presented approach are adaptively adjusted with fuzzy logic to determine the occupied proportions of MFAC and DDOILC according to their different control performances in different control stages. Lastly, the proposed fuzzy DDMFAC (...)
  33. What distinguishes data from models?Sabina Leonelli - 2019 - European Journal for Philosophy of Science 9 (2):22.
    I propose a framework that explicates and distinguishes the epistemic roles of data and models within empirical inquiry through consideration of their use in scientific practice. After arguing that Suppes’ characterization of data models falls short in this respect, I discuss a case of data processing within exploratory research in plant phenotyping and use it to highlight the difference between practices aimed to make data usable as evidence and practices aimed to use data (...)
    19 citations
  34. Data-Driven Dialogue Models: Applying Formal and Computational Tools to the Study of Financial And Moral Dialogues.Olena Yaskorska-Shah - 2020 - Studies in Logic, Grammar and Rhetoric 63 (1):185-208.
    This paper proposes two formal models for understanding real-life dialogues, aimed at capturing argumentative structures performatively enacted during conversations. In the course of the investigation, two types of discourse with a high degree of well-structured argumentation were chosen: moral debate and financial communication. The research project found itself confronted by a need to analyse, structure and formally describe large volumes of textual data, where this called for the application of computational tools. It is expected that the results of (...)
  35. Large Language Models and the Reverse Turing Test.Terrence Sejnowski - 2023 - Neural Computation 35 (3):309–342.
    Large Language Models (LLMs) have been transformative. They are pre-trained foundational models that are self-supervised and can be adapted with fine tuning to a wide range of natural language tasks, each of which previously would have required a separate network model. This is one step closer to the extraordinary versatility of human language. GPT-3 and more recently LaMDA can carry on dialogs with humans on many topics after minimal priming with a few examples. However, there has been a (...)
    3 citations
  36. Learning the structure of linear latent variable models.Peter Spirtes - unknown
    We describe anytime search procedures that (1) find disjoint subsets of recorded variables for which the members of each subset are d-separated by a single common unrecorded cause, if such exists; (2) return information about the causal relations among the latent factors so identified. We prove the procedure is point-wise consistent assuming (a) the causal relations can be represented by a directed acyclic graph (DAG) satisfying the Markov Assumption and the Faithfulness Assumption; (b) unrecorded variables are not caused by recorded (...)
    8 citations
  37. Data Synthesis for Big Questions: From Animal Tracks to Ecological Models.Rose Trappes - 2024 - Philosophy, Theory, and Practice in Biology 16 (1):4.
    This paper addresses a relatively new mode of ecological research: data synthesis studies. Data synthesis studies involve reusing data to create a general model as well as a reusable, aggregated dataset. Using a case from movement ecology, I analyse the trade-offs and strategies involved in data synthesis. Like theoretical ecological modelling, I find that synthesis studies involve a modelling trade-off between generality, precision and realism; they deal with this trade-off by adopting a pragmatic kludging strategy. I (...)
  38. Integration of Heterogeneous Biological Data in Multiscale Mechanistic Model Calibration: Application to Lung Adenocarcinoma.Claudio Monteiro, Adèle L’Hostis, Jim Bosley, Ben M. W. Illigens, Eliott Tixier, Matthieu Coudron, Emmanuel Peyronnet, Nicoletta Ceres, Angélique Perrillat-Mercerot & Jean-Louis Palgen - 2022 - Acta Biotheoretica 70 (3):1-24.
    Mechanistic models are built using knowledge as the primary information source, with well-established biological and physical laws determining the causal relationships within the model. Once the causal structure of the model is determined, parameters must be defined in order to accurately reproduce relevant data. Determining parameters and their values is particularly challenging in the case of models of pathophysiology, for which data for calibration is sparse. Multiple data sources might be required, and data may (...)
  39. Values and inductive risk in machine learning modelling: the case of binary classification models.Koray Karaca - 2021 - European Journal for Philosophy of Science 11 (4):1-27.
    I examine the construction and evaluation of machine learning binary classification models. These models are increasingly used for societal applications such as classifying patients into two categories according to the presence or absence of a certain disease like cancer and heart disease. I argue that the construction of ML classification models involves an optimisation process aiming at the minimization of the inductive risk associated with the intended uses of these models. I also argue that the construction (...)
    4 citations
  40. Data and Model Operations in Computational Sciences: The Examples of Computational Embryology and Epidemiology.Fabrizio Li Vigni - 2022 - Perspectives on Science 30 (4):696-731.
    Computer models and simulations have become, since the 1960s, an essential instrument for scientific inquiry and political decision making in several fields, from climate to life and social sciences. Philosophical reflection has mainly focused on the ontological status of the computational modeling, on its epistemological validity and on the research practices it entails. But in computational sciences, the work on models and simulations are only two steps of a longer and richer process where operations on data are (...)
  41. Computational models and empirical constraints.Zenon W. Pylyshyn - 1978 - Behavioral and Brain Sciences 1 (1):98-128.
    It is argued that the traditional distinction between artificial intelligence and cognitive simulation amounts to little more than a difference in style of research - a different ordering in goal priorities and different methodological allegiances. Both enterprises are constrained by empirical considerations and both are directed at understanding classes of tasks that are defined by essentially psychological criteria. Because of the different ordering of priorities, however, they occasionally take somewhat different stands on such issues as the power/generality trade-off and on (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   98 citations  
  42. Redundancy in Perceptual and Linguistic Experience: Comparing Feature-Based and Distributional Models of Semantic Representation.Brian Riordan & Michael N. Jones - 2011 - Topics in Cognitive Science 3 (2):303-345.
    Since their inception, distributional models of semantics have been criticized as inadequate cognitive theories of human semantic learning and representation. A principal challenge is that the representations derived by distributional models are purely symbolic and are not grounded in perception and action; this challenge has led many to favor feature-based models of semantic representation. We argue that the amount of perceptual and other semantic information that can be learned from purely distributional statistics has been underappreciated. We (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   25 citations  
  43.  28
    Correct data base: Wrong model?Alexander Marshack - 1993 - Behavioral and Brain Sciences 16 (4):767-768.
  44. Coherence and correspondence in the network dynamics of belief suites.Patrick Grim, Andrew Modell, Nicholas Breslin, Jasmine Mcnenny, Irina Mondescu, Kyle Finnegan, Robert Olsen, Chanyu An & Alexander Fedder - 2017 - Episteme 14 (2):233-253.
    Coherence and correspondence are classical contenders as theories of truth. In this paper we examine them instead as interacting factors in the dynamics of belief across epistemic networks. We construct an agent-based model of network contact in which agents are characterized not in terms of single beliefs but in terms of internal belief suites. Individuals update elements of their belief suites on input from other agents in order both to maximize internal belief coherence and to incorporate ‘trickled in’ elements of (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark  
  45.  27
    Personalizing Human-Agent Interaction Through Cognitive Models.Tim Schürmann & Philipp Beckerle - 2020 - Frontiers in Psychology 11.
    Cognitive modeling of human behavior has advanced the understanding of underlying processes in several domains of psychology and cognitive science. In this article, we outline how we expect cognitive modeling to improve comprehension of individual cognitive processes in human-agent interaction and, particularly, human-robot interaction (HRI). We argue that cognitive models offer advantages compared to data-analytical models, specifically for research questions with expressed interest in theories of cognitive functions. However, the implementation of cognitive models is arguably more (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  46.  73
    A new approach to the formulation and testing of learning models.Joseph F. Hanna - 1966 - Synthese 16 (3-4):344-380.
    It is argued that current attempts to model human learning behavior commonly fail on one of two counts: either the model assumptions are artificially restricted so as to permit the application of mathematical techniques in deriving their consequences, or else the required complex assumptions are embedded in computer programs whose technical details obscure the theoretical content of the model. The first failing is characteristic of so-called mathematical models of learning, while the second is characteristic of computer simulation models. (...)
    Direct download (4 more)  
     
    Export citation  
     
    Bookmark   6 citations  
  47.  9
    Model-Based Demography: Essays on Integrating Data, Technique and Theory.Thomas K. Burch - 2017 - Springer Verlag.
    Late in a career of more than sixty years, Thomas Burch, an internationally known social demographer, undertook a wide-ranging methodological critique of demography. This open access volume contains a selection of resulting papers, some previously unpublished, some published but not readily accessible (from past meetings of The International Union for the Scientific Study of Population and its research committees, or from other small conferences and seminars). Rejecting the idea that demography is simply a branch of applied statistics, his work views (...)
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  48.  10
    Data of Covid-19 Infection in Italy and Mathematical Models.Luigi Togliani - 2020 - Science and Philosophy 8 (2):165-180.
    In this paper I consider data on Covid-19 infection in Italy from the 20th of February to the 29th of June 2020. The data are analyzed using fits based on mathematical models. This analysis may be proposed to final-year students of the Liceo Scientifico as a way to engage a real problem with mathematical tools.
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  49. Models of data and theoretical hypotheses: a case-study in classical genetics.Marion Vorms - 2010 - Synthese 190 (2):293-319.
    Linkage (or genetic) maps are graphs, which are intended to represent the linear ordering of genes on the chromosomes. They are constructed on the basis of statistical data concerning the transmission of genes. The invention of this technique in 1913 was driven by Morgan's group's adoption of a set of hypotheses concerning the physical mechanism of heredity. These hypotheses were themselves grounded in Morgan's defense of the chromosome theory of heredity, according to which chromosomes are the physical basis of (...)
    Direct download (6 more)  
     
    Export citation  
     
    Bookmark   5 citations  
  50.  12
    Model-based multidimensional clustering of categorical data.Tao Chen, Nevin L. Zhang, Tengfei Liu, Kin Man Poon & Yi Wang - 2012 - Artificial Intelligence 176 (1):2246-2269.
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   1 citation  
1 — 50 / 975