Searching the RRID Resource Information Network

SciCrunch Registry is a curated repository of scientific resources, with a focus on biomedical resources, including tools, databases, and core facilities. Visit SciCrunch to register your resource.


Page 7, showing results 121 to 140 of 346.

https://edspace.american.edu/openbehavior/project/3d-printed-superfusion-chamber/

Project consists of suite of 3D-printed tools including superfusion chamber with independent brain presser and animal stand. Modular device that facilitates transfer of pharmacological agents and brain access for electrodes and/or lenses for imaging. Provides stable conditions for electrophysiological and neuroimaging recordings. Device can be printed either in-house or through a 3D-printing service.

Proper citation: 3D Printed Superfusion Chamber (RRID:SCR_022901)


https://github.com/LINCellularNeuroscience/VAME

Software Python tool to cluster behavioral signals obtained from pose estimation tools. Unsupervised probabilistic deep learning framework capable of finding behavioral motifs in pose estimation data. Capable of augmenting quantitative behavioral analyses of data derived from standard pose-estimation software packages. Built on core functions from DeepLabCut and works readily with pose data from that package. Can also work with data from other pose estimation packages such as SLEAP (a minimal data-loading sketch follows the citation below).

Proper citation: Variational Embedding of Animal Motion (RRID:SCR_022477)
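
VAME itself is driven from a project configuration (see the repository's demo), but its input is simply framewise pose estimates from a tool such as DeepLabCut. The sketch below shows one way DeepLabCut-style CSV output might be loaded into a frames-by-coordinates array before segmentation; the file name and column handling are assumptions for illustration, not part of VAME's API.

    # Minimal sketch: load DeepLabCut-style pose estimates of the kind VAME segments
    # into behavioral motifs. The file name is hypothetical.
    import pandas as pd

    # DeepLabCut CSVs use a three-row header: scorer / bodyparts / coords (x, y, likelihood).
    poses = pd.read_csv("video1_DLC_poses.csv", header=[0, 1, 2], index_col=0)

    # Keep only the x/y columns and arrange them as (n_frames, n_bodyparts * 2).
    xy = poses.loc[:, poses.columns.get_level_values(-1).isin(["x", "y"])]
    series = xy.to_numpy(dtype=float)
    print(series.shape)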


  • RRID:SCR_023061

    This resource has 1+ mentions.

https://github.com/spoonsso/dannce

Convolutional neural network that calculates 3D positions of user-defined anatomical landmarks on behaving animals from videos taken at multiple angles. Works on recorded video data from multiple cameras. Used for 3D markerless pose estimation and tracking.

Proper citation: DANNCE (RRID:SCR_023061)


  • RRID:SCR_021585

    This resource has 1+ mentions.

https://edspace.american.edu/openbehavior/project/argus/

Portal provides software tool for analysis and quantification of both single and socially interacting zebrafish. Data extraction and analysis tool built in the open source R language for tracking zebrafish behavior.

Proper citation: Argus (RRID:SCR_021585)


  • RRID:SCR_021402

https://github.com/NeLy-EPFL/DeepFly3D

Software tool as PyTorch and PyQt5 implementation of 2D-3D tethered Drosophila pose estimation. Image annotation tool used for pose estimation and appendage tracking. Provides interface for pose estimation and for further correction of 2D pose estimates, which are automatically converted to 3D poses.

Proper citation: DeepFly3D (RRID:SCR_021402)


  • RRID:SCR_021528

https://github.com/LaubachLab/MedParse

Software tool to read MedPC data into Python and MATLAB. Includes MedPC code for saving precise times of behavioral events and MATLAB and Python functions to convert MedPC data into time event codes (an illustrative sketch follows the citation below).

Proper citation: MedParse (RRID:SCR_021528)
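
MedParse's own functions are not reproduced here; the sketch below only illustrates the "time event code" idea described above, assuming the common MED-PC convention of packing an event code into the decimal places of a timestamp. The values, units, and single-digit event codes are assumptions for illustration.

    # Illustrative sketch (not the MedParse API): split "time.event"-encoded values
    # into separate timestamp and event-code arrays. Units and encoding are assumed.
    import numpy as np

    raw = np.array([12.3, 57.1, 103.2, 158.1])  # hypothetical values: seconds.event_code

    times = np.floor(raw)                                # integer part = event time (s)
    events = np.round((raw - times) * 10).astype(int)    # one decimal digit = event code

    for t, e in zip(times, events):
        print(f"t = {t:6.0f} s   event = {e}")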


  • RRID:SCR_024480

https://github.com/danbider/lightning-pose

Software video-centric package for direct video manipulation. Semi-supervised animal pose estimation algorithm, Bayesian post-processing approach, and deep learning package. Improves animal pose estimation via semi-supervised learning, Bayesian ensembling, and cloud-native open-source tools.

Proper citation: Lightning Pose (RRID:SCR_024480)


  • RRID:SCR_021384

    This resource has 1+ mentions.

https://github.com/gordonberman/MotionMapper

Software tool that works on videos (not poses). Uses image analysis and multivariate statistical methods to find stereotyped behaviors in flies, mice, and other species.

Proper citation: MotionMapper (RRID:SCR_021384)


https://github.com/YttriLab/B-SOID

Software pipeline that pairs unsupervised pattern recognition with supervised classification to achieve fast predictions of behaviors that are not predefined by users. Unsupervised learning algorithm that discovers and classifies actions based on the inherent statistics of the data points provided (a conceptual sketch follows the citation below).

Proper citation: Behavioral Segmentation of Open-field in DeepLabCut (RRID:SCR_021385)
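
B-SOiD's actual pipeline is more involved than this; the sketch below only illustrates the general pattern named above, pairing unsupervised clustering with a supervised classifier on pose-derived features, using scikit-learn stand-ins and synthetic data rather than the B-SOiD API.

    # Conceptual sketch only (not the B-SOiD API): discover clusters without labels,
    # then train a supervised classifier to predict those clusters on new frames.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    features = rng.normal(size=(5000, 12))        # synthetic pose-derived features

    # 1) Unsupervised step: group frames into candidate behavioral motifs.
    labels = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(features)

    # 2) Supervised step: learn a fast frame-wise classifier from those cluster labels.
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, labels)

    new_frames = rng.normal(size=(100, 12))       # new pose-derived features
    print(clf.predict(new_frames)[:10])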


  • RRID:SCR_021461

https://edspace.american.edu/openbehavior/project/attys/

Portal related to biomedical signals. Provides wearable data acquisition device with special focus on biomedical signals such as heart activity (ECG), muscle activity (EMG), and brain activity (EEG). Open source high precision Bluetooth data acquisition device.

Proper citation: Attys project (RRID:SCR_021461)


  • RRID:SCR_021462

https://edspace.american.edu/openbehavior/project/autonomouse/

Project related to self-initiated conditioning and behavior tracking in mice. Provides automated system that can track large numbers (over 25) of socially housed mice through RFID chips implanted in the mice. With RFID trackers and other analyses, behavior of mice can be tracked as they train and are subsequently tested on (or self-initiate testing in) an odor discrimination task over months, with thousands of trials performed every day.

Proper citation: AutonoMouse project (RRID:SCR_021462)


  • RRID:SCR_021460

https://edspace.american.edu/openbehavior/project/autoreward2/

Project related to detecting and rewarding rodents in a modified T-maze task. System designed to detect and reward rodents in a modified T-maze, provided by scientists at Ruhr University Bochum, Germany.

Proper citation: Autoreward2 project (RRID:SCR_021460)


https://edspace.american.edu/openbehavior/project/bpm-biosignals/

Portal as educational resource regarding construction and implementation of bioamplifiers as well as interpretation of biosignals.

Proper citation: BPM Biosignals project (RRID:SCR_021459)


https://edspace.american.edu/openbehavior/project/automated-mouse-homecage-two-bottle-choice-test/

Project related to assessing preferences by mice among fluids in their homecages. Provided system includes homecage-fitted apparatus for automated, photobeam-based detection of licks in a two-bottle choice task. Used for automated mouse homecage two-bottle choice test.

Proper citation: Automated mouse homecage two bottle choice test project (RRID:SCR_021457)


https://edspace.american.edu/openbehavior/project/closed-loop-system/

Portal includes system for combined behavioral tracking, electrophysiology, and closed loop stimulation provided by University of Oslo scientists. Integrates Bonsai and Open Ephys with multiple modules. Provides guide for setting up this system.

Proper citation: Closed Loop System project (RRID:SCR_021475)


https://edspace.american.edu/openbehavior/project/usv-detector/

Project related to ultrasonic vocalization detection. Provides device designed for automatic detection of 50 kHz ultrasonic vocalizations.

Proper citation: Ultrasonic Vocalizations Detector project (RRID:SCR_021471)


  • RRID:SCR_021504

    This resource has 1+ mentions.

https://ethowatcher.paginas.ufsc.br/

Software tool for behavioral and video tracking analysis in laboratory animals. Used to support detailed ethography, video tracking, and extraction of kinematic variables from digital video files of laboratory animals.

Proper citation: EthoWatcher (RRID:SCR_021504)


https://www.feldmanlab.org/

Project related to function of cerebral cortex. Included model system for studying cortical function is provided by UC Berkeley scientists. System includes lickometer which employs infrared beam and sensor to minimize electrical noise artifacts during neurophysiology experiments and can be easily mounted in a micromanipulator for precise and repeatable positioning. Open source lickometer was designed in conjunction with open source water delivery system. Together, these provide basic hardware for a DIY behavioral assay and reward system for mice.

Proper citation: Feldman Lab Lickometer project (RRID:SCR_021469)


  • RRID:SCR_021503

https://github.com/arnefmeyer/mousecam

Software Python package with functions for extracting and analyzing data recorded using camera system. Used to monitor behaviors including eye position, whisking, and ear movements in unrestrained animals. Can be mounted in combination with neural implants for recording brain activity.

Proper citation: Mousecam (RRID:SCR_021503)


https://edspace.american.edu/openbehavior/project/hao-chen-lab-repository/

Project aims to establish computing platform for rodent behavior research using Raspberry Pi computers. The lab has built several devices for conducting operant conditioning and monitoring environmental data.

Proper citation: Hao Chen Lab Repository project (RRID:SCR_021467)



Can't find your Tool?

We recommend that you first check the search tips next to the search bar and refine your search. Alternatively, please register your tool with the SciCrunch Registry by adding a little information to a web form; logging in will let you create a provisional RRID, but it is not required to submit.
