Frictionless Blog

Posts tagged "pilot"

  • Frictionless data packages for the NIH CFDE project
    January 12, 2022 by Lilly Winfree (OKF), Philippe Rocca-Serra (University of Oxford) on behalf of the NIH-CFDE

    The Frictionless Data team has been working with Dr. Philippe Rocca-Serra on increasing data set discoverability and highlighting how disparate data can be combined...

  • Frictionless Data and Dryad join forces to validate research data
    August 9, 2021 by Daniella Lowenberg and Lilly Winfree

    A great way to share research data is to upload it to a repository, but how do we ensure that uploaded data doesn't have issues? Frictionless Data and Dryad have joined forces to revamp the upload page of the Dryad application, which now includes Frictionless validation to check data quality (a sketch of such a validation check appears after this list).

  • Dryad and Frictionless Data collaboration
    November 18, 2020 by Tracy Teal

    Announcing a new Pilot collaboration with the data repository Dryad...

  • Goodtables - Expediting the data submission and submitter feedback process
    September 16, 2020 by Adam Shepherd, Amber York, Danie Kinkade, and Lilly Winfree

    This post describes the second part of our Pilot collaboration with BCO-DMO...

  • Clarifying the semantics of data matrices and results tables - a Frictionless Data Pilot
    July 21, 2020 by Philippe Rocca-Serra and Lilly Winfree

    This Pilot will focus on removing the friction in reported scientific experimental results by applying the Data Package specifications... (a minimal Data Package sketch appears after this list).

  • Frictionless Public Utility Data - A Pilot Study
    March 18, 2020 by Zane Selvans, Christina Gosnell, and Lilly Winfree

    This blog post describes a Frictionless Data Pilot with the Public Utility Data Liberation (PUDL) project. Pilot projects are part of the Frictionless Data for Reproducible Research project.

  • Frictionless Data Pipelines for Ocean Science
    February 10, 2020 by Adam Shepherd, Amber York, Danie Kinkade, and Lilly Winfree

    This blog post describes a Frictionless Data Pilot with the Biological and Chemical Oceanography Data Management Office (BCO-DMO). Pilot projects are part of the Frictionless Data for Reproducible Research project. Written by BCO-DMO team members Adam Shepherd, Amber York, and Danie Kinkade, with development by Conrad Schloer.

  • Data Management for TEDDINET
    December 19, 2017 by Julian Padget (DM4T), Dan Fowler (OKI), Evgeny Kariv (OKI), Paul Walsh (OKI), Jo Barratt (OKI)

    Applying the Frictionless Data specifications to TEDDINET's data legacy.

  • Western Pennsylvania Regional Data Center
    December 15, 2017 by Adrià Mecader (OKI)

    Using the ckanext-validation extension to highlight the quality of datasets in CKAN-based open data portals.

  • UK Data Service
    December 12, 2017 by Brook Elgie (OKI), Paul Walsh (OKI)

    Using Frictionless Data software to assess and report on data quality, and to make a case for generating visualizations from the resulting data and metadata.
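
For readers curious what the validation step mentioned in the Dryad and BCO-DMO pilots looks like in practice, here is a minimal sketch using the frictionless Python framework. The file name data.csv is a placeholder, and this is an illustrative example rather than the exact code used in those integrations.

```python
# Minimal sketch: validating a tabular file with the frictionless Python framework.
# "data.csv" is a placeholder path, not a file from any of the pilots.
from frictionless import validate

report = validate("data.csv")

if report.valid:
    print("No issues found.")
else:
    # Each task corresponds to one validated resource; print its errors,
    # e.g. missing cells, blank rows, or type mismatches.
    for task in report.tasks:
        for error in task.errors:
            print(error.message)
```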
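The data-matrix pilot above applies the Data Package specifications. As a rough illustration of the idea, the sketch below wraps a CSV file in a Data Package using the same frictionless Python framework; results.csv is a placeholder, and a real pilot package would carry much richer metadata.

```python
# Minimal sketch: describing a CSV file as a Data Package.
# "results.csv" is a placeholder file name used only for illustration.
from frictionless import Package, Resource

package = Package(resources=[Resource(path="results.csv")])
package.infer()  # detect the table schema (field names and types)
package.to_json("datapackage.json")  # write the descriptor alongside the data
```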