Data Carpentry Data Organization and Manipulations

Data Carpentry’s aim is to teach researchers basic concepts, skills, and tools for working with data so that they can get more done in less time, and with less pain. The lessons below were designed for those interested in working with data in genomics.

Authors: Tracy Teal, Adina Howe
Contributors: Jennifer Bryan, Alexander Duryee, Jeffrey Hollister, Daisie Huang, Owen Jones, Clare Sloggett, Harriet Dashnow, and Ben Marwick


Lesson status: Teaching

Lessons:

Cloud Computing

  1. Introduction to cloud computing
  2. Logging onto cloud
  3. Moving data (see the example after this list)
  4. Single analysis
  5. Parallel analysis
  6. Data roundtripping
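
The cloud lessons center on connecting to a remote machine and copying data to and from it. As a minimal sketch of the kind of commands involved (the username, hostname, and file paths below are placeholders, not part of the lesson materials):

    # Log onto a remote cloud instance over SSH (username and hostname are placeholders)
    ssh dcuser@ec2-XX-XX-XX-XX.compute-1.amazonaws.com

    # Copy a data file from your laptop up to the instance
    scp local_reads.fastq dcuser@ec2-XX-XX-XX-XX.compute-1.amazonaws.com:~/data/

    # Copy results back down when the analysis is done
    scp dcuser@ec2-XX-XX-XX-XX.compute-1.amazonaws.com:~/results/variants.vcf .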

Shell

  1. Importance of Data Organization
  2. Introduction to the E. coli evolution experiment
  3. Examining the SRA run table
  4. Unix organization
  5. Unix filesystem
  6. Searching files (see the example after this list)
  7. Read QC
  8. Know your data
  9. Automate a workflow
  10. Variant calling workflow
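
To give a flavor of the shell skills these lessons build toward, here is a minimal sketch of searching read files and looping over samples to automate a simple check (the file names and the quality signature searched for are illustrative only):

    # Count lines containing a run of ten Ns in one FASTQ file (file name is illustrative)
    grep -c "NNNNNNNNNN" sample1.fastq

    # Automate the same check across every FASTQ file in the directory
    for f in *.fastq
    do
        echo "$f: $(grep -c 'NNNNNNNNNN' "$f") lines with a run of 10 Ns"
    done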

R

The complete R for genomics analysis lesson is HERE


Requirements

Data Carpentry's teaching is hands-on, so participants are encouraged to use their own computers to ensure the proper setup of tools for an efficient workflow.