
Data package: Muscles in Time

    Alternative identifier:
    -
    Related identifier:
    -
    Creator:
    Schneider, David https://orcid.org/0000-0002-3272-2337 [cv:hci Lab - KIT]
    Contributors:
    (Researcher)
    Schneider, David https://orcid.org/0000-0002-3272-2337 [cv:hci Lab]

    (Researcher)
    Reiß, Simon https://orcid.org/0000-0003-1953-6211 [cv:hci Lab - KIT]

    (Researcher)
    Kugler, Marco [Kugler, Marco]

    (Researcher)
    Jaus, Alexander https://orcid.org/0000-0002-0669-0300 [cv:hci Lab - KIT]

    (Researcher)
    Peng, Kunyu https://orcid.org/0000-0002-5419-9292 [cv:hci Lab - KIT]

    (Researcher)
    Sutschet, Susanne https://orcid.org/0009-0009-4085-0003 [IPEK - KIT]

    (Researcher)
    Sarfraz, M. Saquib https://orcid.org/0000-0002-1271-0005 [Mercedes-Benz Tech Innovation]

    (Researcher)
    Matthiesen, Sven https://orcid.org/0000-0001-5978-694X [IPEK - KIT]

    (Researcher)
    Stiefelhagen, Rainer https://orcid.org/0000-0001-8046-4945 [cv:hci Lab - KIT]
    Title:
    Muscles in Time
    Further titles:
    (Subtitle) Learning to Understand Human Motion by Simulating Muscle Activations
    Description:
    (Abstract) Exploring the intricate dynamics between muscular and skeletal structures is pivotal for understanding human motion. However, acquiring ground truth muscle activation data is resource-intensive and results in a scarcity of datasets. The Muscles in Time (MinT) dataset aims to address this by introducing a large-scale synthetic muscle activation dataset. MinT is created by enriching existing motion capture datasets with muscle activation simulations from biomechanical models using the OpenSim platform, a widely accepted tool in biomechanics and human motion research. Neural networks designed for human motion understanding have historically relied on indirect data, like video or motion capture, similar to prisoners in Plato's cave who see only shadows rather than the true objects. Current systems, despite advances in capturing human motion, do not account for the complex inner mechanics—particularly the muscle activations driving human movement. These activations are key to understanding physical exertion and motion difficulty but are often overlooked due to the limitations of traditional data collection methods such as EMG. To overcome these challenges, our dataset, MinT, incorporates simulations that provide detailed muscle activation information. Starting from simple pose sequences, we extract fine-grained muscle activation timings and interactions within the human musculoskeletal system. MinT contains over nine hours of simulation data, covering 227 subjects and 402 muscle strands, offering a comprehensive and scalable resource for further research into human motion.

    (Other) Please find additional information about this dataset and its usage under https://simplexsigil.github.io/mint
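The abstract describes MinT as a long time series of muscle activations intended, among other things, for sequence-to-sequence models. As a purely illustrative sketch (array shapes, sampling rate, and the `window_sequences` helper are assumptions for this example, not MinT's documented format; see the project page for the actual layout), such a recording can be windowed into fixed-length training samples:

```python
import numpy as np

def window_sequences(activations, win_len=150, stride=75):
    """Slice a (T, M) muscle-activation time series into overlapping
    fixed-length windows, returning an array of shape (N, win_len, M).

    activations: T frames x M muscle strands. MinT simulates 402 strands;
    the layout here is illustrative, not the dataset's actual format.
    """
    T, M = activations.shape
    starts = range(0, T - win_len + 1, stride)
    return np.stack([activations[s:s + win_len] for s in starts])

# Toy stand-in data: 10 s at an assumed 50 Hz, 402 muscle strands.
acts = np.random.rand(500, 402)
windows = window_sequences(acts, win_len=150, stride=75)
print(windows.shape)  # (5, 150, 402)
```

Each window can then serve as one input (or target) sequence for a sequence model.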

    Keywords:
    Human motion understanding
    Muscle activation
    Synthetic data
    OpenSim
    Biomechanical modeling
    Sequence-to-sequence models
    Musculoskeletal system
    MinT dataset
    Neural networks
    Ground reaction forces (GRFs)
    Simulation
    Related information:
    (URL) https://simplexsigil.github.io/mint
    Language:
    English
    Publisher:
    Computer Vision for Human Computer Interactions Lab (cv:hci), Institute for Anthropomatics and Robotics (IAR), Karlsruhe Institute of Technology
    Year of creation:
    2024
    Subject area:
    Computer Science
    Object type:
    Dataset
    Data source:
    -
    Software used:
    Software for data processing
    Software:
    OpenSim - 4.0
    Alternative software:
    -
    Data processing:
    Simulation of muscle activations on the basis of motion capture data
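For context on this processing step: OpenSim-style muscle simulations rest on first-order activation dynamics, in which neural excitation u(t) is low-pass filtered into activation a(t), with activation faster than deactivation. The sketch below is a minimal standalone implementation of that standard textbook formulation; the time constants are typical literature values and the code is illustrative, not MinT's actual pipeline:

```python
import numpy as np

def activation_dynamics(u, dt=0.01, tau_act=0.015, tau_deact=0.060, a0=0.0):
    """Integrate first-order muscle activation dynamics:

        da/dt = (u - a) / tau,   tau = tau_act   while u > a (activating),
                                 tau = tau_deact otherwise (deactivating).

    u: neural excitation in [0, 1] per time step; returns activation a(t).
    Time constants are typical literature values, not MinT-specific.
    """
    a = np.empty_like(u, dtype=float)
    prev = a0
    for i, ui in enumerate(u):
        tau = tau_act if ui > prev else tau_deact
        prev = prev + dt * (ui - prev) / tau  # forward-Euler step
        a[i] = prev
    return a

# A 0.5 s excitation burst at 100 Hz: activation rises quickly toward 1,
# then decays more slowly once excitation drops to zero.
u = np.concatenate([np.ones(50), np.zeros(50)])
a = activation_dynamics(u)
```

The asymmetry between tau_act and tau_deact is what makes simulated activations lag and smooth the underlying motion, which is part of why they carry information that raw pose sequences do not.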
    Year of publication:
    2024
    Rights holder:
    Computer Vision for Human Computer Interactions Lab (cv:hci), Institute for Anthropomatics and Robotics (IAR), Karlsruhe Institute of Technology
    Funding:
    Carl-Zeiss-Stiftung - (JuBot - Jung bleiben mit Robotern)
    Status:
    Published
    Submitted by:
    cac2349627c2b7f93e603d4fb2331d8b
    Created on:
    2024-06-11
    Archiving date:
    2024-12-06
    Archive size:
    2.1 GB
    Archive creator:
    cac2349627c2b7f93e603d4fb2331d8b
    Archive checksum:
    5382f696586815980ecbb577de8795bf (MD5)
    Embargo period:
    -
    DOI: 10.35097/VDPCEFSThBWlDPFL
    Publication date: 2024-12-06
    License for the data package
    This work is licensed under:
# Licenses

## MinT

If you use this data, we kindly ask you to cite our paper:

```
@inproceedings{schneider2024muscles,
  title     = {Muscles in Time: Learning to Understand Human Motion In-Depth by Simulating Muscle Activations},
  author    = {Schneider, David and Rei{\ss}, Simon and Kugler, Marco and Jaus, Alexander and Peng, Kunyu and Sutschet, Susanne and Sarfraz, M Saquib and Matthiesen, Sven and Stiefelhagen, Rainer},
  booktitle = {The Thirty-eighth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year      = {2024}
}
```

### License

The MinT dataset is published under a Creative Commons CC BY-NC 4.0 License as defined here: [https://creativecommons.org/licenses/by-nc/4.0/](https://creativecommons.org/licenses/by-nc/4.0/)

MinT builds on several motion capture datasets, the licenses of which are listed below.

## KIT Whole-Body Human Motion Database

If you use this data, the creators of the dataset would kindly ask you to cite their work in your project:

```
@inproceedings{AMASS_KIT-CNRS-EKUT-WEIZMANN,
  author    = {Christian Mandery and {\"O}mer Terlemez and Martin Do and Nikolaus Vahrenkamp and Tamim Asfour},
  title     = {The {KIT} Whole-Body Human Motion Database},
  booktitle = {International Conference on Advanced Robotics (ICAR)},
  pages     = {329--336},
  year      = {2015},
}

@article{AMASS_KIT-CNRS-EKUT-WEIZMANN-2,
  author  = {Christian Mandery and {\"O}mer Terlemez and Martin Do and Nikolaus Vahrenkamp and Tamim Asfour},
  title   = {Unifying Representations and Large-Scale Whole-Body Motion Databases for Studying Human Motion},
  journal = {IEEE Transactions on Robotics},
  volume  = {32},
  number  = {4},
  pages   = {796--809},
  year    = {2016},
}

@inproceedings{AMASS_KIT-CNRS-EKUT-WEIZMANN-3,
  author    = {Franziska Krebs and Andre Meixner and Isabel Patzer and Tamim Asfour},
  title     = {The {KIT} Bimanual Manipulation Dataset},
  booktitle = {IEEE/RAS International Conference on Humanoid Robots (Humanoids)},
  pages     = {499--506},
  year      = {2021},
}
```

## Total Capture

If you use this data, the creators of the dataset would kindly ask you to cite their work in your project:

```
@inproceedings{AMASS_TotalCapture,
  author    = {Trumble, Matt and Gilbert, Andrew and Malleson, Charles and Hilton, Adrian and Collomosse, John},
  title     = {{Total Capture}: 3D Human Pose Estimation Fusing Video and Inertial Sensors},
  booktitle = {2017 British Machine Vision Conference (BMVC)},
  year      = {2017}
}
```

### License

The datasets are free for research use only. This agreement must be confirmed by a senior representative of your organisation. To access and use this data you agree to the following conditions:

* Multiple view video sets and associated data files will be used for research purposes only.
* The dataset source should be acknowledged in all publications and publicity material in which it is used, or in which results derived from the data are used, by referencing the relevant publication as indicated for specific datasets and including the repository link.
* The datasets should not be used for commercial purposes.
* The data should not be redistributed.

To request access to the TotalCapture Dataset, or for other queries, please contact: a.gilbert@surrey.ac.uk

## EyesJapanDataset

If you use this data, the creators of the dataset would kindly ask you to cite their work in your project:

```
@misc{AMASS_EyesJapanDataset,
  title  = {{Eyes Japan MoCap Dataset}},
  author = {Eyes JAPAN Co. Ltd.},
  url    = {http://mocapdata.com}
}
```

When you use our data, please show our credit as:

```
This motion capture data is licensed by mocapdata.com, Eyes, JAPAN Co. Ltd. under the Creative Commons Attribution 2.1 Japan License. To view a copy of this license, contact mocapdata.com, Eyes, JAPAN Co. Ltd. or visit http://creativecommons.org/licenses/by/2.1/jp/. http://mocapdata.com/ (C) Copyright Eyes, JAPAN Co. Ltd. 2008-2009.
```

Your credit is only our hope; thanks for supporting free culture! Please let us know about works using our data so that we can introduce them on our site, too.

This data is under a CC license, Attribution-Share Alike 2.1 Japan. You are free:

* to Share: to copy, distribute and transmit the work
* to Remix: to adapt the work

Under the following conditions:

* Attribution: You must attribute the work in the manner specified by the author or licensor.
* Share Alike: If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or a similar licence to this one.

For more detail, please refer to: [http://creativecommons.org/licenses/by-sa/2.1/jp/deed.en](http://creativecommons.org/licenses/by-sa/2.1/jp/deed.en)

If you have any questions or comments, please contact info@mocapdata.com. Also, if you need your original mocap data or data use without credit, please feel free to contact us.

## BMLrub

### Data License

For non-commercial and scientific research purposes. Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use the BMLrub data (the "Data"). By downloading and/or using the Data, you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data.

### Ownership

The Data was developed at the BioMotion Lab in 2002, when it was hosted by the Ruhr University Bochum, and was funded by the Volkswagen Foundation. It is owned by and proprietary material of Nikolaus Troje, Director of the BioMotion Lab, which is currently at York University, Toronto, Canada.

### License Grant

Nikolaus Troje grants you a non-exclusive, non-transferable, free-of-charge right:

* To install the Data on computers owned, leased or otherwise controlled by you and/or your organisation;
* To use the Data for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects.

Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artefacts for commercial purposes including, for example, 3D models, movies, or video games. The Data may not be reproduced, modified and/or made available in any form to any third party without prior written permission from Nikolaus Troje.

### Human Subjects Data

The Data was captured using a population of students and staff from Ruhr University Bochum. All participants gave their informed, written consent for the scientific analysis and publication of their 3D motion capture data.

### Disclaimer of Representations and Warranties

You expressly acknowledge and agree that the Data results from basic research, is provided "AS IS", may contain errors, and that any use of the Data is at your sole risk. The BioMotion Lab makes no representations or warranties of any kind concerning the Data, neither expressed nor implied, and the absence of any legal or actual defects, whether discoverable or not. Specifically, and not to limit the foregoing, the BioMotion Lab makes no representations or warranties (i) regarding the merchantability or fitness for a particular purpose of the Data, (ii) that the use of the Data will not infringe any patents, copyrights or other intellectual property rights of a third party, and (iii) that the use of the Data will not cause any damage of any kind to you or a third party.

### Limitation of Liability

Under no circumstances shall N. Troje or the BioMotion Lab be liable for any incidental, special, indirect or consequential damages arising out of or relating to this license, including but not limited to, any lost profits, business interruption, loss of programs or other data, or all other commercial damages or losses, even if advised of the possibility thereof.

### No Maintenance Services

You understand and agree that the BioMotion Lab is under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Data. The BioMotion Lab nevertheless reserves the right to update, modify, or discontinue the Data at any time.

### Publication with BMLrub data

You agree to cite relevant papers published by the BioMotion Lab. This website lists the most up-to-date bibliographic information on the front page.

### Commercial licensing opportunities

For commercial uses, please contact the BioMotion Lab at troje@yorku.ca.

## BMLmovi

If you use this data, the creators of the dataset would kindly ask you to cite their work in your project:

```
@article{AMASS_BMLmovi,
  title   = {{MoVi}: A Large Multipurpose Motion and Video Dataset},
  author  = {Saeed Ghorbani and Kimia Mahdaviani and Anne Thaler and Konrad Kording and Douglas James Cook and Gunnar Blohm and Nikolaus F. Troje},
  journal = {arXiv preprint arXiv:2003.01888},
  year    = {2020}
}
```

### Data License

For non-commercial and scientific research purposes. Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use the BMLmovi Data (the "Data"). By downloading and/or using the Data, you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Data. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

### Ownership

The Data was collected between July and December 2018 at the BioMotion Lab. The funding was provided by NSERC and CFREF VISTA. The Data is owned by and proprietary material of Prof. Dr. Nikolaus Troje, Director of the BioMotion Lab, which is currently located at York University, Toronto, Canada.

### License Grant

Nikolaus Troje grants you (Licensee) a non-exclusive, non-transferable, free-of-charge right:

* To obtain and install the Data on computers owned, leased or otherwise controlled by you and/or your organization;
* To use the Data for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects;
* To modify, adapt, translate or create derivative works based upon the Data.

Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artefacts for commercial purposes including, for example, 3D models, pictures, movies, or video games. The Data may not be reproduced, modified and/or made available in any form to any third party without prior written permission from Nikolaus Troje. The Data may not be used for pornographic purposes or to generate pornographic material whether commercial or not. This license also prohibits the use of the Data to train methods/algorithms/neural networks/etc. for commercial use of any kind. By downloading the Data, you agree not to reverse engineer it.

### No Distribution

The Data and the License herein granted shall not be copied, shared, distributed, re-sold, offered for re-sale, transferred or sub-licensed in whole or in part, except that you may make one copy for archive purposes only.

### Human Subjects Data

The Data was captured using a population of students and staff from Queen's University, Kingston, Canada. All participants gave their informed, written consent for the scientific analysis and publication of their video Data, IMU Data, and 3D motion capture Data.

### Disclaimer of Representations and Warranties

You expressly acknowledge and agree that the Data results from basic research, is provided "AS IS", may contain errors, and that any use of the Data is at your sole risk. Nikolaus Troje and the BioMotion Lab make no representations or warranties of any kind concerning the Data, neither expressed nor implied, and the absence of any legal or actual defects, whether discoverable or not. Specifically, and not to limit the foregoing, Nikolaus Troje and the BioMotion Lab make no representations or warranties (i) regarding the merchantability or fitness for a particular purpose of the Data, (ii) that the use of the Data will not infringe any patents, copyrights or other intellectual property rights of a third party, and (iii) that the use of the Data will not cause any damage of any kind to you or a third party.

### Limitation of Liability

Under no circumstances shall Nikolaus Troje or the BioMotion Lab be liable for any incidental, special, indirect or consequential damages arising out of or relating to this license, including but not limited to, any lost profits, business interruption, loss of programs or other Data, or all other commercial damages or losses, even if advised of the possibility thereof.

### No Maintenance Services

You understand and agree that Nikolaus Troje and the BioMotion Lab are under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Data. The BioMotion Lab nevertheless reserves the right to update, modify, or discontinue the Data at any time. Defects of the Data must be notified in writing to the BioMotion Lab with a comprehensible description of the error symptoms (Email: movi@yorku.ca). The notification of the defect should enable the reproduction of the error. The Licensee is encouraged to communicate any use, results, modification or publication.

### Publications using BMLmovi Data

You acknowledge that the Data is a valuable scientific resource and agree to appropriately cite the most recent paper describing the BMLmovi database in any publication making use of the Data. (Note: Citing the dataset URL instead of the publication(s) would not be compliant with this license agreement.) Note that part of the BMLmovi Data is included in the Archive of Motion Capture as Surface Shapes (AMASS; [https://amass.is.tue.mpg.de/](https://amass.is.tue.mpg.de/)) and was processed using methods introduced by AMASS. Using this part of the dataset falls under the AMASS license agreement and requires you to appropriately cite the AMASS publication by N. Mahmood, N. Ghorbani, N. F. Troje, G. Pons-Moll, and M. J. Black (2019).

### Acknowledgements

We wish to thank Nima Ghorbani for post-processing our motion capture data so it could be added to the AMASS dataset, and all other authors of AMASS for their approval to add the processed data to our dataset. We further wish to thank Viswaijt Kumar for his help with post-processing the data, setting up the data repository, and designing and managing the website. This research was funded by an NSERC Discovery Grant and contributions from CFREF VISTA to NFT.

## AMASS

### Dataset Copyright License for non-commercial scientific research purposes

Please read carefully the following terms and conditions and any accompanying documentation before you download and/or use the AMASS Dataset and the accompanying Software (jointly referred to as the "Dataset"). By downloading and/or using the Dataset, you acknowledge that you have read these terms and conditions, understand them, and agree to be bound by them. If you do not agree with these terms and conditions, you must not download and/or use the Dataset. Any infringement of the terms of this agreement will automatically terminate your rights under this License.

### Ownership / Licensees

The Dataset and the associated materials have been developed at the Max Planck Institute for Intelligent Systems (hereinafter "MPI"). Any copyright or patent right is owned by and proprietary material of the Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. (hereinafter "MPG"; MPI and MPG hereinafter collectively "Max-Planck"), hereinafter the "Licensor".

### License Grant

Licensor grants you (Licensee) personally a single-user, non-exclusive, non-transferable, free-of-charge right:

* To obtain and install the Dataset on computers owned, leased or otherwise controlled by you and/or your organization;
* To use the Dataset for the sole purpose of performing non-commercial scientific research, non-commercial education, or non-commercial artistic projects.

Any other use, in particular any use for commercial purposes, is prohibited. This includes, without limitation, incorporation in a commercial product, use in a commercial service, or production of other artefacts for commercial purposes. The Dataset may not be reproduced, modified and/or made available in any form to any third party without Max-Planck's prior written permission. The Dataset may not be used for pornographic purposes or to generate pornographic material whether commercial or not. This license also prohibits the use of the Dataset to train methods/algorithms/neural networks/etc. for commercial use of any kind. By downloading the Dataset, you agree not to reverse engineer it.

### No Distribution

The Dataset and the license herein granted shall not be copied, shared, distributed, re-sold, offered for re-sale, transferred or sub-licensed in whole or in part, except that you may make one copy for archive purposes only.

### Disclaimer of Representations and Warranties

You expressly acknowledge and agree that the Dataset results from basic research, is provided "AS IS", may contain errors, and that any use of the Dataset is at your sole risk. LICENSOR MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND CONCERNING THE SOFTWARE, NEITHER EXPRESS NOR IMPLIED, AND THE ABSENCE OF ANY LEGAL OR ACTUAL DEFECTS, WHETHER DISCOVERABLE OR NOT. Specifically, and not to limit the foregoing, Licensor makes no representations or warranties (i) regarding the merchantability or fitness for a particular purpose of the Dataset, (ii) that the use of the Dataset will not infringe any patents, copyrights or other intellectual property rights of a third party, and (iii) that the use of the Dataset will not cause any damage of any kind to you or a third party.

### Limitation of Liability

Because this Software License Agreement qualifies as a donation, according to Section 521 of the German Civil Code (Bürgerliches Gesetzbuch – BGB), Licensor as a donor is liable for intent and gross negligence only. If the Licensor fraudulently conceals a legal or material defect, they are obliged to compensate the Licensee for the resulting damage. Licensor shall be liable for loss of data only up to the amount of typical recovery costs which would have arisen had proper and regular data backup measures been taken. For the avoidance of doubt, Licensor shall be liable in accordance with the German Product Liability Act in the event of product liability. The foregoing applies also to Licensor's legal representatives or assistants in performance. Any further liability shall be excluded. Patent claims generated through the usage of the Dataset cannot be directed towards the copyright holders. The contractor points out that add-ons as well as minor modifications to the Dataset may lead to unforeseeable and considerable disruptions.

### No Maintenance Services

You understand and agree that Licensor is under no obligation to provide either maintenance services, update services, notices of latent defects, or corrections of defects with regard to the Dataset. Licensor nevertheless reserves the right to update, modify, or discontinue the Dataset at any time. Defects of the Dataset must be notified in writing to the Licensor with a comprehensible description of the error symptoms. The notification of the defect should enable the reproduction of the error. The Licensee is encouraged to communicate any use, results, modification or publication.

### Publications using the Dataset

You acknowledge that the Dataset is a valuable scientific resource and agree to appropriately reference the following paper in any publication making use of the Dataset. Citation:

```
@inproceedings{AMASS:2019,
  title         = {AMASS: Archive of Motion Capture as Surface Shapes},
  author        = {Mahmood, Naureen and Ghorbani, Nima and F. Troje, Nikolaus and Pons-Moll, Gerard and Black, Michael J.},
  booktitle     = {The IEEE International Conference on Computer Vision (ICCV)},
  year          = {2019},
  month         = {Oct},
  url           = {https://amass.is.tue.mpg.de},
  month_numeric = {10}
}
```
    Cite data package
    Schneider, David (2024): Muscles in Time. Computer Vision for Human Computer Interactions Lab (cv:hci), Institute for Anthropomatics and Robotics (IAR), Karlsruhe Institute of Technology. DOI: 10.35097/VDPCEFSThBWlDPFL

    RADAR4KIT is an internet-based service for archiving and publishing research data from completed scientific studies and projects, available to KIT researchers. It is operated by the Karlsruhe Institute of Technology (KIT). RADAR4KIT builds on the RADAR service provided by FIZ Karlsruhe. Data is stored exclusively on KIT's IT infrastructure at the Steinbuch Centre for Computing (SCC).

    Content assessment and quality assurance are carried out exclusively by the data providers.

    1. The usage relationship between you ("data user") and KIT is limited to the download of data packages or metadata. KIT reserves the right to restrict the use of RADAR4KIT or to discontinue the service entirely.
    2. If you register as a data user or authenticate via Shibboleth, the data provider may also grant you access to unpublished documents.
    3. The handling of your personal data is explained in the privacy policy.
    4. KIT assumes no warranty or liability for the correctness, currency, or reliability of the provided content, except where liability is mandated by law.
    5. KIT does not charge you as a data user for searching RADAR4KIT or for downloading data packages.
    6. You must comply with the license terms associated with the data package.