1 - 10 of 16
Search Results
-
- Description:
- This data collection template serves to collect primary life cycle inventory (LCI) data for the life cycle assessment (LCA) of reused concrete building elements from reuse practitioners and researchers conducting experimental work. The template aids in documenting the building-element-specific energy and resource consumption caused during the reuse process. It was developed in the context of CRC 1683 by sub-project C02, which is responsible for the LCA of the building element reuse process.
About CRC 1683: The German construction sector is responsible for substantial resource and energy consumption, as well as high waste generation. CRC 1683 explores the possibility of a circular, modular reuse of cast-in-place concrete load-bearing structures to mitigate these environmental impacts. Divided into three project groups, the CRC researches all process steps necessary for reuse: the disassembly of source load-bearing structures, the characterisation of the reused building elements' properties and conditions, their refurbishment, and their connection and installation into new target structures. More information on CRC 1683 can be obtained from: https://www.sfb1683.ruhr-uni-bochum.de/mru/index.html.en
Sub-project C02: Sub-project C02 evaluates the environmental benefits of the reuse process developed in CRC 1683 by means of LCA on the building element, building, and national building stock levels. The template presented in this dataset was created to gather the necessary LCI data from sub-projects conducting experimental work for the LCA of the reuse process at the building element level.
- Keyword:
- LCA, Life Cycle Assessment, Circular Economy, CE, Construction, and Sustainability
- Publisher:
- Language:
- English
- Date Uploaded:
- 2026-01-21
- Date Modified:
- 2026-01-23
- License:
- Creative Commons BY Attribution 4.0 International
- Resource Type:
- Dataset
-
- Description:
- The provided Bash scripts are used to convert raw magnetic resonance imaging (MRI) data from the Digital Imaging and Communications in Medicine (DICOM) standard format into a Brain Imaging Data Structure (BIDS)-compliant dataset. The scripts convert DICOM files into the Neuroimaging Informatics Technology Initiative (NIfTI) format, which can be used for preprocessing MRI data. In addition to the NIfTI files, the conversion also generates the metadata required by the BIDS standard. Dataset quality and compliance can be checked using a BIDS validator (e.g., https://bids-standard.github.io/bids-validator/). The two scripts are adapted for different experimental sessions: script_dcm2nii_S1.sh is used for data from the first session (S1), while script_dcm2nii_S2.sh is used for the second session (S2). The scripts differ in that S2 additionally processes diffusion-weighted imaging (DWI) data. Both scripts were developed as part of my own work with my individual MRI data. If you want to use them on your own data, you may need to adjust the scripts to match your raw data's naming conventions. For more information, see the ReadMe file.
- Keyword:
- NIfTI (Neuroimaging Informatics Technology Initiative), conversion, Bash script, DICOM (Digital Imaging and Communications in Medicine), and BIDS (Brain Imaging Data Structure)
- Publisher:
- Language:
- English
- Date Uploaded:
- 2026-01-14
- Date Modified:
- 2026-01-20
- License:
- MIT License
- Resource Type:
- Software
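The conversion step described above can be sketched as follows. It uses `dcm2niix`, a widely used converter for exactly this DICOM-to-NIfTI task, but the raw-data layout and BIDS naming shown here are illustrative assumptions, not those of the published scripts:

```shell
#!/usr/bin/env bash
# Sketch of a DICOM -> BIDS-style conversion step, assuming the dcm2niix
# converter and a hypothetical raw-data layout sourcedata/<subject>/<session>.
set -euo pipefail

# Build a BIDS-compliant anatomical file stem for a subject/session pair.
bids_anat_stem() {
  local sub="$1" ses="$2"
  printf 'sub-%s/ses-%s/anat/sub-%s_ses-%s_T1w' "$sub" "$ses" "$sub" "$ses"
}

# Convert one session's DICOM folder; -b y also writes the JSON sidecar
# metadata required by the BIDS standard, -z y gzips the NIfTI output.
convert_session() {
  local sub="$1" ses="$2" raw="sourcedata/${sub}/${ses}"
  local stem; stem="$(bids_anat_stem "$sub" "$ses")"
  mkdir -p "$(dirname "$stem")"
  dcm2niix -b y -z y -f "$(basename "$stem")" -o "$(dirname "$stem")" "$raw"
}

bids_anat_stem 01 S1   # prints sub-01/ses-S1/anat/sub-01_ses-S1_T1w
```

The S2 variant described in the record would add a second call for the DWI series, writing into a `dwi/` subfolder with a `_dwi` suffix instead of `anat/`/`_T1w`.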
-
- Description:
- This R script was developed to "translate" movement log files from a navigation task (path integration) experiment into event tables. These event tables summarize the recorded movements and can be used for further analyses, for example, to synchronize movement logs with brain activity. The script processes all CSV log files in a specified folder, classifies movement types, groups consecutive events, calculates onset, duration, and angles, and generates both individual and combined master event tables. Sample data are provided to test the script, though path adjustments are required. For more information, see the ReadMe file.
- Keyword:
- RStudio script, Movement log, and Event Table
- Publisher:
- Language:
- English
- Date Uploaded:
- 2026-01-14
- Date Modified:
- 2026-01-19
- License:
- MIT License
- Resource Type:
- Software
-
- Description:
- Generates configurable path integration (PI) patterns for Virtual Reality (VR) navigation experiments with RStudio. Creates 4-point configurations (Spawn→Start→Distractor→Goal→back to Start) optimized for maximum path length while respecting geometric constraints. The script was developed out of the need for rapid randomized generation of path integration patterns that can be easily adapted to the respective experimental conditions. In addition to the actual RStudio script, the package also contains an accompanying ReadMe file, sample input data, and the corresponding output files.
- Keyword:
- Virtual Experiments, Path Integration, RStudio script, and Virtual Environments
- Publisher:
- Language:
- English
- Date Uploaded:
- 2026-01-14
- Date Modified:
- 2026-01-19
- License:
- MIT License
- Resource Type:
- Dataset
-
- Description:
- Visualizes participant navigation trajectories from Virtual Reality (VR) path integration (PI) experiments, showing target locations, traversed paths, orientation arrows, and drop error distances in RStudio. The script was developed to check whether participants performed the task correctly or whether other problems occurred during a trial (e.g., path integration marker was not deactivated correctly). Such trials were successfully excluded from further data analysis using the visualization script. In addition to the actual RStudio script, the package also includes an accompanying ReadMe file, sample input data, and the corresponding output visualizations.
- Keyword:
- RStudio script, path visualization, and path integration
- Publisher:
- Language:
- English
- Date Uploaded:
- 2026-01-14
- Date Modified:
- 2026-01-19
- License:
- MIT License
- Resource Type:
- Dataset
-
Traversed Path Visualization
User Collection
- Description:
- Visualizes participant navigation trajectories from Virtual Reality (VR) path integration experiments, showing target locations, traversed paths, orientation arrows, and drop error distances in RStudio.
- Keyword:
- RStudio script, path visualization, and path integration
- Language:
- English
- Date Created:
- 2026-01-14
- Resource Type:
- Software
- Description:
- The dataset contains continuous waveforms from a two-week Distributed Acoustic Sensing (DAS) experiment and the resulting catalog(s). The DAS data were recorded along a ~15 km-long dark telecom optical fiber connecting the villages of Antipata (Kefalonia) and Stavros (Ithaki), Greece. Continuous waveforms cover the period from 1 August 2024, 23:00 UTC to 15 August 2024, 23:00 UTC, with a short data gap between 09:14 and 09:48 UTC on 6 August 2024 due to an archiving interruption. File names correspond to UTC time + 1 hour, following the default system convention. Data acquisition was performed using a QuantX OptaSense interrogator with a ping rate of 5000 Hz, and subsequently decimated by a factor of 20 to a final sampling rate of 250 Hz. The system configuration used a gauge length of 10 m and a channel spacing of 2.04 m, resulting in a total of 7,750 channels. Waveform data are stored in 30-second segments and in the default instrument unit (optical phase shift). Additionally, we include the channel list after removing loops.
The DOWNLOAD ALL button provides a bash script that downloads all files in this dataset using a sequence of 'curl' commands. You may edit the script to suit your needs—for example, by removing or commenting out 'curl' lines corresponding to files or time intervals you do not wish to download.
- Keyword:
- Distributed Acoustic Sensing (DAS), Distributed Fiber Optic Sensing (DFOS), Seismicity, and Earthquake Catalog
- Subject:
- Earthquake and Seismology
- Publisher:
- Language:
- English
- Date Uploaded:
- 2025-12-04
- Date Modified:
- 2025-12-18
- License:
- Creative Commons CC0 1.0 Universal
- Resource Type:
- Dataset
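A selective download of the kind the record suggests (editing the curl sequence down to a time window) can be sketched as below. The base URL and file extension are invented placeholders, and the name pattern only follows the stated conventions (30-second segments, file names at UTC time + 1 hour); GNU `date` is assumed:

```shell
#!/usr/bin/env bash
# Sketch: fetch only the 30-second DAS segments inside a chosen UTC window.
# BASE_URL and the .h5 extension are hypothetical; the dataset's own
# DOWNLOAD ALL script is simply a sequence of curl commands.
set -euo pipefail

BASE_URL="https://example.org/das"   # placeholder, not the real dataset URL

# Print one file name per 30-second segment between two UTC instants.
# Per the record, file names correspond to UTC time + 1 hour.
segment_names() {
  local start_utc="$1" end_utc="$2" t end
  t=$(date -u -d "$start_utc" +%s)
  end=$(date -u -d "$end_utc" +%s)
  while [ "$t" -lt "$end" ]; do
    date -u -d "@$((t + 3600))" +'%Y%m%d_%H%M%S.h5'   # +1 h shift in the name
    t=$((t + 30))
  done
}

# Example: the first minute of recording (two 30-s segments).
segment_names '2024-08-01 23:00:00' '2024-08-01 23:01:00' | while read -r f; do
  echo curl -fSO "${BASE_URL}/${f}"   # drop 'echo' to actually download
done
```

This reproduces the record's advice in the other direction: instead of deleting unwanted curl lines from the provided script, it generates only the wanted ones.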
-
- Description:
- This dataset is part of a Distributed Fiber Optic Sensing (DFOS) strain-monitoring study conducted along a dark optical fiber link between Boxmeer and Deurne in the Netherlands. The work investigates the application of DFOS using existing dark fiber infrastructure from telecommunication companies to capture strain variations along tens of kilometers with high spatial and temporal resolution. The experiment covered the time period from 2025-06-25 08:19:32 UTC to 2025-08-11 11:56:03 UTC. This dataset contains the recordings covering the arrival of seismic waves from the 2025 M8.8 Kamchatka earthquake between 2025-07-29 23:30:01 UTC and 2025-07-30 01:15:46 UTC. Recordings with the QuantX system use the following settings: gauge length of 20 m, channel spacing of 2 m, and a sampling rate of 250 Hz. The dark fiber has a length of nearly 36 km, leading to a total of 18,100 channels in each file. File naming follows the start timestamp in BTS. The 26 geo-referenced channel locations from tap tests and/or locatable noise sources are contained in this dataset as well.
- Keyword:
- Distributed Fiber Optic Sensing (DFOS), Seismic Monitoring, Kamchatka, Earthquake, Ground Motion, Netherlands, Distributed Acoustic Sensing (DAS), and Distributed Dynamic Strain Sensing (DDSS)
- Subject:
- Earthquake and Seismology
- Publisher:
- Language:
- English
- Date Uploaded:
- 2025-09-12
- Date Modified:
- 2025-09-29
- License:
- Creative Commons CC0 1.0 Universal
- Resource Type:
- Dataset
-
- Description:
- This dataset is part of the publication "A PLM-Integrated automated synthetic image generation pipeline for object detection" presented at the PLM conference 2025 in Sevilla. The paper presents an automated synthetic image data generation pipeline aimed at streamlining the training process of object detection models supporting manual assembly processes. By automating the rendering of images from CAD models instead of relying on manually created physical product images, the pipeline enables dataset creation in earlier phases of the product lifecycle while also significantly reducing manual effort. This approach makes the development of fine-tuned object detection models more accessible. The pipeline integrates two core components: object similarity analysis and synthetic image generation. The similarity analysis groups visually similar objects into unified classes for the object detection model, reducing confusion during detection. The image generation process can be augmented with contextual information from virtual 3D workplace scenes, thereby significantly mitigating the sim-to-real gap. The pipeline is accessed via a REST API, enabling seamless integration with PLM systems for automated retrieval of CAD models and workplace scene data. A workflow manager orchestrates interactions between the user, the PLM system, and the generation pipeline. The effectiveness of the system is validated by evaluating object detection models trained on the synthetically generated datasets against real-world images, demonstrating its potential to improve detection accuracy and robustness in industrial environments. By publishing the data used, we aim to strengthen the traceability of the results obtained and to encourage further research in this field.
- Keyword:
- object detection, industrial object detection, synthetic data, plm, and product lifecycle management
- Publisher:
- Language:
- English and German
- Date Uploaded:
- 2025-06-03
- Date Modified:
- 2025-06-05
- License:
- Creative Commons BY Attribution 4.0 International
- Resource Type:
- Dataset
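From a client's perspective, the REST-driven workflow described above might look like the following. The endpoint URL and every JSON field are invented for illustration, since the record does not document the actual API; consult the publication for the real interface:

```shell
#!/usr/bin/env bash
# Hypothetical client-side sketch of a request to a REST-accessed image
# generation pipeline. The endpoint and JSON fields are invented.
set -euo pipefail

API_URL="http://localhost:8080/generate"   # hypothetical endpoint

# Compose a request body asking for synthetic images of one CAD model
# rendered in a given workplace scene (all identifiers are made up).
payload() {
  cat <<'JSON'
{"cad_model_id": "example-part-042", "scene": "assembly-workplace-A", "num_images": 100}
JSON
}

payload   # inspect the request body
# To actually submit it:
# payload | curl -sf -X POST -H 'Content-Type: application/json' -d @- "$API_URL"
```

In the architecture the record describes, such requests would come from the workflow manager rather than from a user directly, with the CAD model and scene references resolved against the PLM system.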
-
- Description:
- Simulation tasks and numerical codes have become increasingly complex in the context of temporal and spatial upscaling for understanding and predicting subsurface rock mass mechanics, for example associated with radioactive waste disposal. Benchmark procedures based on laboratory datasets are required to verify the predictive capabilities of computational approaches. In this study, a high-quality laboratory dataset of conventional geomechanical experiments on granite was generated. Numerical simulations of the experiments were performed as proof-of-concept using COMSOL Multiphysics and RocScience RS to derive a benchmark procedure for numerical quality assurance that can be applied to computational approaches in the context of nuclear waste disposal and beyond. The laboratory schedule was specifically developed for the numerical simulation of time-dependent deformation characteristics in granite. Splitting tensile strength, uniaxial compressive strength, and triaxial compressive strength of carefully characterised specimens were investigated at different strain rates covering five orders of magnitude. Strength decreased with decreasing strain rate, and the presence of water decreased strength significantly. A significant contribution of end face friction in the experimental setup to the results of strength tests was verified and recommendations for preparational and experimental procedures in deformation experiments on granite were derived. Based on the laboratory dataset, 2D numerical simulations with RocScience RS2 successfully reproduced the effect of different lubricants to modify end face friction on strength, and COMSOL Multiphysics was able to reproduce the time-dependent deformation characteristics observed for granite. Using crack phase field damage modelling, COMSOL Multiphysics predicted triaxial compressive strength from uniaxial compressive strength by adjusting nothing but the boundary conditions. 
In both approaches, the adaptation of microstructural properties was required to successfully simulate the experimental findings, pointing to a distinct need to further improve the understanding of the microstructural processes causing the time-dependent deformation characteristics and to evaluate the potential for temporal upscaling to long-term processes exceeding those covered by laboratory experiments. The results of this study will significantly contribute to gaining more confidence in the predictive capabilities of numerical codes and to identifying code-specific parameters that are critical for successful prediction.
When referencing this report, please use the following citation: Witte L.C., Asghari Chehreh H., Backers T., Duda M., Aydin M. & Parvin S., 2024. Predictive capability of coupled rock behaviour – development of an experimentally based benchmark for numerical quality assurance (BeNuQuA). ReSeeD Research Data Repository, Ruhr University Bochum.
- Keyword:
- rock deformation, laboratory experiments, granite, Rocscience, nuclear waste, rocks, time-dependence, code comparison, benchmark, numerical simulation, and Comsol Multiphysics
- Subject:
- granite, Comsol Multiphysics, numerical simulation, laboratory experiments, rocks, nuclear waste, benchmark, Rocscience, time-dependence, rock deformation, and code comparison
- Publisher:
- Language:
- English
- Date Uploaded:
- 2025-06-02
- Date Modified:
- 2025-06-05
- License:
- Creative Commons BY-SA Attribution-ShareAlike 4.0 International
- Resource Type:
- Report
