Casualty Transport and Evacuation
01-Combat casualty management during aeromedical evacuation from Role 2 to Role 3 medical treatment facilities
03-Collective aeromedical evacuations of SARS-CoV-2-related ARDS patients in a military tactical plane- a retrospective descriptive study
02-Decision Support System Proposal for Medical Evacuations in Military Operations
04-Characteristics of Medical Evacuation by Train in Ukraine, 2022.
05-Unmanned Aircraft Systems for Casualty Evacuation What Needs to be Done
08-Radar human breathing dataset for applications of ambient assisted living and search and rescue operations
07-RESCUESPEECH- A GERMAN CORPUS FOR SPEECH RECOGNITION IN SEARCH AND RESCUE DOMAIN
12-EU and WHO join forces to further strengthen medical evacuation operations in Ukraine
11-MASCAL medevac: Connecticut Army Guard medics prove their capabilities in mass-casualty training
06-Target localization using information fusion in WSNs-based Marine search and rescue
13-Characteristics of medical evacuation by train in Ukraine
09-Optimal search path planning of UUV in battlefeld ambush scene
10-Volunteer medics evacuate wounded soldiers from the front line in Ukraine
14-A Multi-Objective Optimization Method for Maritime Search and Rescue Resource Allocation An Application to the South China Sea
17-Ukrainian Healthcare Professionals Experiences During Operation Gunpowder Implications for Increasing and Enhancing Training Partnerships
15-An Integrated YOLOv5 and Hierarchical Human Weight-First Path Planning Approach for Efficient UAV Searching Systems
16-YOLOv5s maritime distress target detection method based on swin transformer
19-THE USE OF ARTIFICIAL INTELLIGENCE AT THE STAGES OF EVACUATION, DIAGNOSIS AND TREATMENT OF WOUNDED SOLDIERS IN THE WAR IN UKRAINE
20-Ukrainian Healthcare Professionals Experiences During Operation Gunpowder Implications for Increasing and Enhancing Training Partnerships
21-Artificial intelligence for medical evacuation in great-power conflict
18-Decision Support System Proposal for Medical Evacuations in Military Operations
23-CASUALTY TRANSPORT AND EVACUATION
24-Simulation Analysis of a Military Casualty Evacuation System
26-Aeromedical Evacuation, the Expeditionary Medicine Learning Curve, and the Peacetime Effect.
25-Unmanned Aircraft Systems for Casualty Evacuation What Needs to be Done
27-Characteristics of Medical Evacuation by Train in Ukraine, 2022.
28-Collective aeromedical evacuations of SARS-CoV-2-related ARDS patients in a military tactical plane- a retrospective descriptive study
31-Medical Evacuation of Seriously Injured in Emergency Situations- Experience of EMERCOM of Russia and Directions of Development
30-Evaluation of Topical Off-the-Shelf Therapies to Reduce the Need to Evacuate Battlefield-Injured Warfighters
29-Decision Support System Proposal for Medical Evacuations in Military Operations
32-Decision support in search and rescue: a systematic literature review
32-The Syrian civil war- Timeline and statistics
35-Indonesian National Armed Forces prepare to send aircraft for evacuation 1
33-eAppendix 1. Information leaflet basic medical evacuation train MSF – Version April 2022
36-Combat medics on the battlefield
34-Characteristics of Medical Evacuation by Train in Ukraine
22-Air Force accelerates change to save lives: how the aeromedical evacuation mission has advanced over 20 years
40-Aeromedical evacuation
43-How long can the US military's golden hour last
42-Army pairs helicopters, boats, and artificial intelligence for casualty evacuation
47-Evacuation of wounded soldiers
46-The history of casualty evacuation: from horse-drawn wagons to helicopters
37-The road from death back to life
41-Evacuation hospital
52-Indian Armed Forces experience with aeromedical evacuation of casualties
53-"A journey through hell": evacuating wounded Ukrainian soldiers
45-The evacuation chain for wounded and sick soldiers
54-Motivated but poorly resourced soldiers are left to fend for themselves
57-Medical evacuation by train in Ukraine, 2022
51-Medics evacuate wounded Ukrainian soldiers amid fierce fighting
59-Ukraine unveils its medical evacuation train
61-Russian soldiers deploy improvised UGVs for medical evacuation in Ukraine
60-"Mobile intensive care unit": 24 hours with combat medics in Ukraine's Donbas
50-Medical evacuation: safeguarding the lives of the wounded
Alaska Air National Guard medevacs injured Army paratroopers
Aeromedical evacuation: the Indian experience (abstract)
Solving the military medical evacuation problem with a random forest simulation planning approach
Characteristics of medical evacuation by train in Ukraine, 2022
Instructor Guide for Tactical Field Care 3E: Preparing for Casualty Evacuation and Key Points
Military medical evacuation
Casualty evacuation in Arctic and extreme cold environments: a paradigm shift in managing traumatic hypothermia in Tactical Combat Casualty Care
-Field medical evacuation and on-site casualty evacuation
Casualty evacuation images
Combat casualty management during transfer from Role 2 to Role 3 medical facilities
Decision support system proposal for medical evacuations in military operations
Collective aeromedical evacuations of SARS-CoV-2-related ARDS patients in a military tactical plane: a retrospective descriptive study
Characteristics of medical evacuation by train in Ukraine, 2022
The evolution of foreign militaries' rescue-and-evacuation echelons and medical support measures through changing forms of warfare
Wounded Warrior Battalion East (English) _Wounded_Warrior_Battalion_East
Organization of emergency medical consultations and medical evacuation, 2015 (Russian)
08-Radar human breathing dataset for applications of ambient assisted living and search and rescue operations
<p><a href="https://doi.org/10.1016/j.dib.2023.109757">Data in Brief 51 (2023) 109757</a></p><p><img src="/media/202408//1724838577.972758.jpeg" /><img src="/media/202408//1724838577.9811258.jpeg" /></p><p>Contents lists available at <a href="http://www.ScienceDirect.com/science/journal/23523409">ScienceDirect</a></p><p>Data in Brief</p><p>journal homepage: <a href="http://www.elsevier.com/locate/dib">www.elsevier.com/locate/dib</a></p><p>Data Article</p><p>Radar human breathing dataset for</p><img src="/media/202408//1724838577.986422.jpeg" /><p><img src="/media/202408//1724838577.990556.png" /></p><p><a href="http://crossmark.crossref.org/dialog/?doi=10.1016/j.dib.2023.109757&domain=pdf">check for</a></p><p><a href="http://crossmark.crossref.org/dialog/?doi=10.1016/j.dib.2023.109757&domain=pdf">updates</a></p><p>applications of ambient assisted living and <a id="bookmark1"></a>search and rescue operations</p><p>Cansu Eren<a href="#bookmark1">a,*,</a> Saeid Karamzadeh<a href="#bookmark1">b,</a> Mesut Kartal<a href="#bookmark2">c</a></p><p>a <em>Satellite Communication and Remote Sensing, Department of Communication Systems, Informatics Institute, Istanbul Technical University, Istanbul, Türkiye</em></p><p>b <em>Millimeter Wave Technologies, Intelligent Wireless System, Silicon Austria Labs (SAL), 4040 Linz, Austria. 
Electrical and Electronics Engineering Department, Faculty of Engineering and Natural Sciences, Bahçes¸ehir University, 34349 </em><a id="bookmark2"></a><em>Istanbul, Türkiye</em></p><p>c <em>Department of Electronics and Communication Engineering, Istanbul Technical University, Istanbul, Türkiye</em></p><p>a r t i c l e i n f o</p><p><em>Article history:</em></p><p>Received 3 July 2023</p><p>Revised 9 October 2023</p><p>Accepted 30 October 2023</p><p>Available online 10 November 2023</p><p><img src="/media/202408//1724838578.0137448.png" /><a href="https://data.mendeley.com/datasets/cbj37wdsdj">Dataset link: Radar Human Breathing Dataset for Applications of Ambient</a></p><p><a href="https://data.mendeley.com/datasets/cbj37wdsdj">Assisted Living and Search and Rescue Operations (Original data)</a></p><p><em>Keywords:</em></p><p>Ultrawideband (UWB) Vital signs</p><p>Ambient assisted living, radar Monitoring</p><p>a b s t r a c t</p><p>This dataset consists of signatures of human vital signs that are recorded by ultrawideband radar and lidar sensors. The data acquisition scene considers the human posture mod- els(supine/lateral/facedown), different radar antenna angles towards the human, various set of distances and operational radar characteristics (bandwidth selection/mean power). The raw data files of lidar&radar and processed data files are pre- sented separately in the data repository. The lidar sensor is chosen as a reference sensor. There are 432 data records, and each data scene’s trial number is eight. There is a ho- mogeneous wooden table to mimic clutter while forming a dataset. Thus, this dataset covers applications of search and rescue operations, sleep monitoring, and ambient assisted living (AAL) applications.</p><p>© 2023 The Author(s). 
Published by Elsevier Inc.</p><p>This is an open access article under the CC BY license <a href="http://creativecommons.org/licenses/by/4.0/">(http://creativecommons.org/licenses/by/4.0/)</a></p><p><a id="bookmark3"></a>* Corresponding author.</p><p><em>E-mail address: </em><a href="mailto:buyukhan@itu.edu.tr">buyukhan@itu.edu.tr</a> (C. Eren).</p><p><a href="https://doi.org/10.1016/j.dib.2023.109757">https://doi.org/10.1016/j.dib.2023.109757</a></p><p>2352-3409/© 2023 The Author(s). Published by Elsevier Inc. This is an open access article under the CC BY license <a href="http://creativecommons.org/licenses/by/4.0/">(http://creativecommons.org/licenses/by/4.0/)</a></p><p>2 <em>C. Eren, S. Karamzadeh and M. Kartal/Data in Brief 51 (2023) 109757</em></p><p>Specifications Table</p><table><tr><td><p>Subject</p><p>Specific subject area</p></td><td><p>Electrical and Electronic Engineering, Signal Processing, Data Science</p><p>Non-contact monitoring of human vital signs; through wall UWB radar; ambient-assisted living (AAL); artificial intelligence& machine learning</p></td></tr><tr><td><p>Type of data</p></td><td><p>Radar Data (Calibration, Processed, Raw Radar, ErrorAnalysis) and Reference Data (Lidar) are given in “ . mat" format. The codes are provided as “ . m." A brief explanation for the</p></td></tr><tr><td><p>How the data were acquired</p></td><td><p>data scene “.pdf,”, and the licenses “.txt” for coding are also included.</p><p>The proposed dataset is formed by FlatEarth’s UWB Radar Solutions: Salsa Ancho radar</p><p>module and ST life. augmented: VL53L0X laser-ranging sensor (Lidar). XETHRU X2-Impulse Radar Transceiver by NOVELDA, transmitting and receiving antennas are placed in</p><p>BeagleBone Black cape, which is the element of the Salsa Ancho radar module that enables the data transmission. 
The data recording and the communication interference between</p></td></tr><tr><td><p>Data format</p><p>Description of data collection</p></td><td><p>the Salsa Ancho radar module and computer is established using SalsaLab MATLAB</p><p>Toolbox. The data recording and communication interference between Lidar sensor and computer is established using Arduino IDE and MATLAB.</p><p>Raw (Radar, Calibration, Reference), Processed Data and Error Calculation</p><p>The proposed data is recorded at 1.75 GHz,2.5 GHz, and 3.1 GHz bandwidths. The</p><p>threshold signal value of the bandwidth is at −10 dB. The clutter is chosen as a 3-cm thick wooden table. The UWB radar sensor is placed above the wooden table and this distance</p><p>is varied between at 8 cm to 27 cm, and the lidar sensor is located under the wooden</p><p>table. Supine, lateral, and face-down human orientations are regarded through the UWB</p><p>radar and lidar device. The trial number of data recordings for each data scenario was 8.</p><p>The duration of each data recording is 58.6 s. The radar orientations are chosen at angles where 0。and 30。towards the human chest. The distances between radar and lidar sensors</p></td></tr><tr><td><p>Data source location</p></td><td><p>and human is varied between 46 cm to 78 cm. The human is 32 years old, and the</p><p>height&weight of the human is 173 cm, 59 kg. The size of the human chest is 30cmx23cm. While data collection, the human is lying steadily on the floor. While data recording, the</p><p>lidar and radar sensors are placed so that the observation of the human chest is clear. The reference and radar data recordings are realized simultaneously. After raw data collection, the processed data is formed to extract breath frequencies. 
The calibration data is acquired where the human subject is absent in the data scene.</p><p>• City: Maltepe/Istanbul</p><p>• Country: Turkey</p></td></tr><tr><td><p>Data accessibility</p></td><td><p>Repository name: Mendeley Data</p><p>Data identification number: <a href="https://doi.org/10.17632/cbj37wdsdj.2">10.17632/cbj37wdsdj.2</a> Direct link to Data</p></td></tr></table><p><strong>1. Value of the Data</strong></p><p>• This dataset allows researchers to analyze and develop applications for non-contact monitor- ing of vital signs, ambient assistant living (AAL) and search rescue operations after disasters.</p><p>• This dataset provides a brief data source for artificial intelligence applications that automatize and enhance vital human sign detection in various fields.</p><p>• This dataset contributes literature to examine near antenna field regions and cluttered envi- ronments.</p><p>• Enabling usage of the selected set of primary parameters during data acquisition, that is, the type of human posture, operational radar characteristics, the cluttered environment, different sets of ranges and radar positions, the dataset provides better understanding and visualiza-</p><p>tion to researchers.</p><p><strong>2. Objective</strong></p><p>The developments in ultrawideband (UWB) radar technology validate the capability of de- tection&monitoring of vital human signs via radar techniques. The proposed dataset aims to provide detailed human vital records regarding human posture orientation, selection of opera- tional radar characteristics, radar positioning through human body, different set of ranges and near-field clutter environment.</p><p><em>C. Eren, S. Karamzadeh and M. Kartal/Data in Brief 51 (2023) 109757 </em>3</p><p><strong>3. 
Data Description</strong></p><p>Contactless monitoring of human vital signs is an appealing technology in medicine, defense, and search and rescue operations <a href="#bookmark4">[1].</a> The growing share of the aging population worldwide and the recent COVID pandemic highlight the need for remote devices that can examine people's physical activity at home, help diagnose illnesses, and enhance the quality of life of older adults <a href="#bookmark5">[2,</a><a href="#bookmark6">3].</a> Blood pressure, body temperature, breath, and heart rate are the principal human vital signs <a href="#bookmark7">[4].</a> The breath rate indicates the vitality of the human body and causes larger displacements of the human chest than the heartbeat motion <a href="#bookmark8">[5].</a> Thus, breath detection is crucial for vital sign monitoring. The breath rate is detectable with contact-based and remote sensors such as electrocardiography, strain-based sensors, photoplethysmography, and radar systems <a href="#bookmark7">[4].</a> The standard breath frequencies of human beings are 0.2–0.33 Hz <a href="#bookmark9">[6].</a> Gender, age, weight, abnormalities in lung expansion and contraction while breathing, and exercise all affect human breath rates. Breath rate estimation is therefore applied in demanding biomedical applications such as anomaly detection (obstructive sleep apnea, sudden infant death syndrome) <a href="#bookmark5">[2,7,8].</a> Other main research fields of human vital sign monitoring are the physical examination of athletes during exercise and the search and rescue of humans after natural disasters such as earthquakes and avalanches <a href="#bookmark10">[9].</a></p><p>Contact-based sensors require attachment to the human body to monitor breath rates, which causes skin irritation during extended observation. Contact-based sensors are also prone to motion artifacts and cannot be used on injured tissue. Thus, such sensors reduce patient comfort in hospitals and homes. Moreover, these systems are unusable in search operations after natural disasters <a href="#bookmark11">[10].</a></p><p>UWB radar systems offer high range resolution and non-ionizing, contactless, rapid data acquisition <a href="#bookmark12">[11].</a> Owing to their wide bandwidth, the tiny movements of the human chest while breathing are detectable by UWB radar systems <a href="#bookmark13">[12].</a> Other radar systems can also remotely detect human vital signs, such as continuous wave radars, frequency-modulated continuous wave radars, non-linear radars, and stepped-frequency continuous-wave (SFCW) radars <a href="#bookmark14">[13,</a><a href="#bookmark15">14].</a> Machine learning algorithms improve the accuracy of human vital sign estimates, such as breath rate, in various fields, making radar systems a strong candidate for monitoring human vital signs in various data scenarios. The challenges of breath rate detection via radar systems are the human orientation towards the radar sensors, cluttered environments, random body movement (RBM), the orientation of the radar systems, movement of the human during data collection, and multiple human beings in the data records <a href="#bookmark16">[15,</a><a href="#bookmark17">16].</a> The selection of the operational bandwidth/center frequency and the transmitted power can enhance the accuracy of breath rate estimation via radar systems. However, more publicly shared radar vital sign datasets are still needed to test the ability of current machine learning algorithms. In this proposed dataset, we take into consideration the human postures, different sets of antenna positions, various settings of ranges, and operational radar characteristics such as bandwidth selection.</p><p>The file structure of the proposed data is given in <a href="#bookmark18">Fig. 
1.</a> The data file consists of the sub data files “Calibration Data,” “Processed Data,” “Raw Radar Data,” “ErrorCalculation,” and “Reference Data” . There is additional documentation for the data, such as licenses for the coding files&database and a brief explanation of the data as “AboutData.pdf” . Each data file consists of sub-files such as bandwidth1, bandwidth2, and bandwidth3, except “Calibration Data” . Each sub-file covers the two different sets of angle orientations of the radar system, the three different types of human posture, and the six different sets of range differences for “Processed Data,” “Raw Radar Data,” and “Reference Data.” The “ErrorCalculation” data file is where the relative absolute error calculations are made for each bandwidth selection. “Calibration Data” holds the experiments performed while the human is absent. After applying the techniques given in “Section: Experimental design, materials and methods”, the data is recorded in “Processed Data” .</p><p>Processed data consists of the background-subtracted data (“BS_…”), estimated radar breath signals (“FilteredBreathRadar_…”), reference signals (“FilteredBreath_Ref_…”), the spectra of the breath&reference signals (“SpectrumRadar_…”, “SpectrumRef_…”), and the estimated radar and reference breath frequency matrices (“EstimatedBreathFrequencyRadar.mat”, “EstimatedBreathFrequencyRef.mat”). Moreover, signal-to-noise ratio calculations of the UWB radar data and reference data are included in “Processed Data” as “RefSignalToNoise.mat” and “RadarSignalToNoise.mat” . The histogram presentation of the absolute relative error calculations and the error presentations are given as “Error#.fig” and “Histogram#.fig.” The human posture in the data naming refers to the subject orientation (supine/lateral/facedown). The operational radar bandwidth name index is “…Band#” (1, 2, 3). “…DeltaR=#” (27 cm, 22.5 cm, 20 cm, 15.5 cm, 10 cm, and 8 cm) refers to the distance between the UWB radar and the wooden table. “…Angle=#” (30° and 0°) gives the UWB radar’s orientation towards the human chest. The repeat number of the data recording is given by “…Trial#” .</p><p>The local error data of each data record scene and the error data of the bandwidth selections are given in the “ErrorCalculation” data file as “ErrorBandwidth#.mat” and “LocalErrorTable#.mat.”</p><p><img src="/media/202408//1724838578.075624.jpeg" /></p><p><strong>Fig. 2. </strong>The UWB radar data record samples that regard bandwidth selection and human posture.</p><p>We also include “DataTableBandwidth#.mat”, which consists of the estimated breath frequencies and signal-to-noise ratios.</p><p><a id="bookmark19"></a><a href="#bookmark19">Fig. 2</a> shows the bScan radar data records of the different bandwidth selections and human postures. The x-axis shows the time duration of the data collection, which is slow time, and the y-axis refers to the range in meters. <a href="#bookmark20">Fig. 3</a> presents the data records where the human is located at different ranges and the UWB radar is positioned at 0° and 30°. The time duration of <a id="bookmark20"></a>each data record is 58.6 s and the unambiguous range is up to 1 m.</p><p><img src="/media/202408//1724838578.0814762.jpeg" /></p><p><strong>Fig. 3. 
</strong>The UWB radar data record samples that regard different ranges and antenna orientations.</p><p>The histogram and absolute relative error of the data for each bandwidth selection are given in <a href="#bookmark21">Fig. 4.</a> The absolute relative error is calculated with the formula in <a href="#bookmark22">(1.1)</a>, where f<sub>radar</sub> is the breath frequency estimated from the radar and f<sub>reference</sub> is the breath frequency estimated from the lidar sensor. N represents the number of data trials. The alpha values correspond to the signal-to-noise ratios of each bandwidth selection in the histogram data in <a href="#bookmark21">Fig. 4.</a></p><p><a id="bookmark22"></a><strong>ε</strong> = 100 × (1/N) Σ<sub>n=1</sub><sup>N</sup> |f<sub>radar,n</sub> − f<sub>reference,n</sub>| / f<sub>reference,n</sub> (1.1)</p><p><img src="/media/202408//1724838578.100183.jpeg" /></p><p><strong>Fig. 4. </strong>The density representation of detected breath frequencies and the absolute relative error rates of UWB breath data.</p><p><a id="bookmark21"></a><strong>4. Experimental Design, Materials and Methods</strong></p><p>This section explains the experimental design of the data collection and the methodology used to extract the “Processed Data” .</p><p><em>4.1. Experimental design</em></p><p>The model for the TWR data collection and its equipment is shown in <a href="#bookmark23">Fig. 5</a>(a)-(b). The Salsa Ancho Radar Module and the STlife.augmented range and gesture sensor are used during data collection <a href="#bookmark24">[17,</a><a href="#bookmark25">18].</a> Both systems are applicable to biomedical applications and provide safety considerations for human experiments. The proposed radar system operates over 4.5 GHz-9.5 GHz. The bandwidth of the radar system is adjusted to 1.75 GHz, 2.5 GHz, and 3.1 GHz at −10 dB</p><p><img src="/media/202408//1724838578.1194382.jpeg" /></p><p><strong>Fig. 5. </strong>Experimental design. (a) Depiction of the experimental scene regarding the selection of radar operational parameters and antenna and subject orientations. (b) The experimental setup used in dataset acquisition. (c) Human body postures<em>.</em></p><p><a id="bookmark23"></a>using the graphical interface of the equipment. The center frequencies are 5.3 GHz, 7.7 GHz, and 8.8 GHz. The mean powers in those regions are −10.7 dBm, −14 dBm, and −16.4 dBm, respectively. The range accuracy is 4 mm for the radar and up to 1.2 mm for the lidar sensor. The dimensions of the radar system and lidar sensor are 58.42 × 54.61 mm and 4.4 × 2.4 × 1 mm. The antenna dimensions of the radar sensor are 4.2 cm × 6.5 cm. The radar geometry is monostatic. MATLAB 2020A and the Arduino IDE are used during the acquisition of radar and lidar data.</p><p>A homogeneous 3-cm thick wooden table is positioned between the human and the UWB radar sensor. The distance between the radar and the table was adjustable and varied between 8 cm and 27 cm. The range between the human and the lidar was 48 cm, and the lidar was placed under the wooden table where it directly saw the human chest.</p><p>Wood was selected as the clutter type because of its simple and well-known characteristics. The distances between the human and the radar system vary depending on the subject’s posture. The human lay still for 58.57 s. The data recordings were repeated eight times in each scenario. The human subject orientations were organized as supine/lateral/facedown. The range resolutions of the UWB radar were 8.6 cm, 6 cm, and 4.8 cm.</p><p>The radar system was positioned at two different angles during data recording. 
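The stated range resolutions follow directly from the standard pulse-radar relation ΔR = c/(2B). A quick Python check (illustrative only; this is not part of the dataset's MATLAB tooling):

```python
# Range resolution of a pulsed UWB radar from its operational bandwidth:
# delta_R = c / (2 * B). The three bandwidths used in the dataset
# (1.75, 2.5, and 3.1 GHz) yield the resolutions stated in the text
# (8.6 cm, 6 cm, and 4.8 cm).
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Two-way range resolution of a pulsed radar, in metres."""
    return C / (2.0 * bandwidth_hz)

for b in (1.75e9, 2.5e9, 3.1e9):
    print(f"B = {b / 1e9:.2f} GHz -> delta_R = {range_resolution(b) * 100:.1f} cm")
```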
The data record parameters are given in <a href="#bookmark26">Table 1.</a> The corresponding posture definitions and their ranges <a id="bookmark26"></a>between human-radar/human-lidar are shown in <a href="#bookmark23">Fig. 5</a>(c).</p><p><strong>Table 1</strong></p><p>The parameters of the data records and scenarios.</p><table><tr><td><p>Human Body Position</p></td><td><p>Radar Center Frequency</p></td><td><p>Radar Mean Power</p></td><td><p>Radar Bandwidth</p></td><td><p>Distance Between Target and Radar</p></td><td><p>Angle Variations of Radar</p></td><td><p>Distance Between Target and Lidar</p></td><td><p>Number of Trials</p></td></tr><tr><td><p>Face-down, supine, lateral</p></td><td><p>5.3 GHz, 7.7 GHz, 8.8 GHz</p></td><td><p>−10.7 dBm, −14 dBm, −16.4 dBm</p></td><td><p>1.75 GHz, 2.5 GHz, 3.1 GHz</p></td><td><p>59 cm–78 cm</p></td><td><p>0°, 30°</p></td><td><p>48 cm</p></td><td><p>8</p></td></tr></table><p>The radar and lidar sensors work simultaneously. Fundamentally, the proposed radar system emits higher-order Gaussian modulated signals into the data scene using its transmitter antenna. The emitted radio waves are reflected from the table, human chest, and surrounding objects. Each object produces a time delay in the echo signal, which is collected by the receiver. After the sampling process, the radar data is formed. Unlike the radar system, the lidar transmitter sends light waves at a 940 nm wavelength through the scene. 
The emitted light waves bounce back from the scene and are collected by the receiver. The lidar sensor records the ranging value of the targets. Owing to the periodic movement of the human chest while breathing, the recorded ranging values vary. Thus, the recorded radar and lidar ranging values are closer while the lungs are inflated and farther apart in the deflated phase of the lungs.</p><p><em>4.2. Methods</em></p><p>The schematic of the data recording scene has already been given in <a href="#bookmark23">Fig. 5.</a> This section explains the mathematical description of the scene and the methodology used while forming the “Processed Data” file. The nominal distance (d<sub>0</sub>) between the human subject and the UWB radar system is <a id="bookmark27"></a>given in <a href="#bookmark27">(1.2).</a></p><p>d<sub>0</sub> = ‖X<sub>h</sub> − X<sub>tx</sub>‖ + ‖X<sub>rx</sub> − X<sub>h</sub>‖ (1.2)</p><p><img src="/media/202408//1724838578.139105.jpeg" /></p><p><strong>Fig. 6. </strong>The flowchart of the proposed dataset, DeltaR=10cm_Angle=0_Band1_Trial1.</p><p>where the positions of the human, transmitter, and receiver antennas are given as X<sub>h</sub>, X<sub>tx</sub>, and X<sub>rx</sub>. The time-varying distance due to the movement of the chest while breathing is given as an offset from the nominal distance in <a href="#bookmark28">(1.3).</a> The breath and heartbeat frequencies are f<sub>b</sub> and f<sub>h</sub> in <a href="#bookmark29">(1.3).</a> The slow time vector is symbolized by t. The amplitudes that correspond to the breath and heartbeat movements are given by m<sub>b</sub> and m<sub>h</sub>. The values of the human breathing and heartbeat frequencies <a id="bookmark28"></a>are 0.2–0.33 Hz and 1–1.33 Hz, respectively.</p><p><em>d</em>(<em>t</em>) = <em>d</em><sub>0</sub> + <em>m<sub>b</sub></em> sin(2π<em>f<sub>b</sub>t</em>) + <em>m<sub>h</sub></em> sin(2π<em>f<sub>h</sub>t</em>) (1.3)</p><p><a id="bookmark29"></a>The time of arrival between the human and the UWB radar system is given in <a href="#bookmark29">(1.4).</a></p><p>τ<sub>d</sub>(t) = 2d(t)/c = τ<sub>0</sub> + τ<sub>b</sub> sin(2πf<sub>b</sub>t) + τ<sub>h</sub> sin(2πf<sub>h</sub>t) (1.4)</p><p>where c is the speed of light in vacuum, τ<sub>0</sub> is the nominal delay time, and τ<sub>b</sub> and τ<sub>h</sub> are the breathing- and heartbeat-related time delays. The received radar breath signal is the convolution of the transmitted signal and the impulse response of the system, where τ is the fast time in <a href="#bookmark30">(1.5).</a> The impulse <a id="bookmark30"></a>response of the system is given by h(t, τ) and s(τ) is the radar transmitted signal in <a href="#bookmark31">(1.5).</a></p><p>r(τ, t) = <em>s</em>(τ) ∗ h(t, τ) (1.5)</p><p>Since the UWB radar collects the echo signal reflected from the environment, such as clutter, walls, and humans, the received data is the superposition of echoes corresponding to different delays in slow time. The received signal is given in <a href="#bookmark31">(1.6)</a>, where A<sub>n</sub> is the amplitude of the echo <a id="bookmark31"></a>signal at the nth sample of the bScan radar data.</p><p>r(τ, t) = Σ<sub>n</sub> A<sub>n</sub> s(τ − τ<sub>n</sub>(t)) (1.6)</p><p>The received signal includes the clutter and noise. After the sampling process, the received signal is presented as:</p><p>R[m, n] = <em>h</em>[m, n] + <em>c</em>[m] + <em>n</em>[m, n] (1.7)</p><p>where h[m, n] corresponds to the presence of the human body, c[m] is clutter, and n[m, n] is total noise, with m the sampled fast time index and n the sampled slow time index. An example of the data processing of the UWB radar breath signal and reference signal is given in <a href="#bookmark27">Fig. 
6.</a> The background removal of the collected radar data is achieved by Linear Trend Subtraction. The signal without <a id="bookmark32"></a>clutter is represented as in <a href="#bookmark32">(1.8).</a></p><p>R[m, n] = α<sub>v</sub>s(mδR − vτ<sub>v</sub>(nT<sub>s</sub>)) (1.8)</p><p>where T<sub>s</sub> is the pulse repetition interval of the UWB radar, t = nT<sub>s</sub>, n = 0,…,N−1, δR is the sampling interval in slow-time samples, and v is the speed of light. The breath frequency can be extracted <a id="bookmark33"></a>by the Fourier Transform of <a href="#bookmark30">(1.5)</a> in slow time, which is expressed in <a href="#bookmark33">(1.9)</a> <a href="#bookmark34">[19].</a></p><p>R(m, f) = ∫ r(mδT, t) e<sup>−j2πft</sup> dt (1.9)</p><p>where δT is the sampling interval in fast time, f corresponds to frequency, and t is the time. After range estimation and extraction of the breath signal, we applied a low-pass filter to eliminate unwanted noise. Then, we estimated the breath frequencies of the reference and radar data.</p><p>The radar dataset articles published by researchers in the literature are given in <a href="#bookmark35">Table 2.</a> The reference equipment, radar types and their operational characteristics, data fields, data formats, and the numbers of trials are presented. The datasets focus on human activity recognition, which considers hand gestures and macro human movements such as walking and falling, and on breath and heart rate estimation, which regards apnea, post-exercise, speech, etc. Angle variations, human postures, and cluttered environments are considered in some cases, such as human-motion recognition. However, datasets that focus on biomedical and civilian applications need to be shared to develop ambient-assisted living systems and search and rescue radar systems and to enhance their accuracy. Thus, in our dataset, we take the operational radar characteristics, human postures, different ranges, and antenna orientations as data recording parameters. To validate our data, we used lidar as an additional source.</p><p><strong>Table 2</strong></p><p>Dataset research in the literature.</p><table><tr><td><p>Ref.</p></td><td><p>Year</p></td><td><p>Reference Equipment</p></td><td><p>Radar Type</p></td><td><p>Radar Frequency</p></td><td><p>Methodology</p></td><td><p>Scenarios</p></td><td><p>Trial Number</p></td><td><p>Data Format</p></td></tr><tr><td><p>Proposed</p></td><td><p>2023</p></td><td><p>Lidar</p></td><td><p>UWB</p></td><td><p>5.3 GHz, 7.7 GHz, 8.8 GHz</p></td><td><p>FFT</p></td><td><p>Biomedical and Ambient-Assisted Living</p></td><td><p>8</p></td><td><p>.mat</p></td></tr><tr><td><p><a id="bookmark35"></a><a href="#bookmark36">[20]</a></p></td><td><p>2022</p></td><td><p>ECG and respiratory belts</p></td><td><p>Continuous Wave</p></td><td><p>24 GHz</p></td><td><p>FFT</p></td><td><p>Biomedical</p></td><td><p>1</p></td><td><p>.csv</p></td></tr><tr><td><p><em>(5)</em></p></td><td><p>2022</p></td><td><p>–</p></td><td><p>FMCW</p></td><td><p>5.8 GHz, 400 MHz bandwidth</p></td><td><p>FFT and classification algorithms</p></td><td><p>Human Activity Recognition</p></td><td><p>Up to 3</p></td><td><p>.dat</p></td></tr><tr><td><p><a href="#bookmark37">[21]</a></p></td><td><p>2022</p></td><td><p>–</p></td><td><p>Continuous Wave</p></td><td><p>–</p></td><td><p>FFT, STFT and classification algorithms</p></td><td><p>Human Activity Recognition</p></td><td><p>–</p></td><td><p>.jpg, .mat</p></td></tr><tr><td><p><a href="#bookmark38">[22]</a></p></td><td><p>2020</p></td><td><p>PPG, ECG, PCG, radar, respiration sensor</p></td><td><p>Continuous Wave</p></td><td><p>24 GHz</p></td><td><p>R peaks and T-wave end detection, STFT</p></td><td><p>Biomedical</p></td><td><p>Up to 30</p></td><td><p>.mat and .csv</p></td></tr><tr><td><p><a 
Thus, In our dataset, we take the operational radar characteristics, human postures, different ranges, and antenna orientations as data recording pa- rameters.To validate our data, we used lidar as an additional source.</p><p>10 <em>C Eren, S Karamzadeh and M Kartal/ Data in Brief 51 (2023) 109757</em></p><p><strong>Table 2</strong></p><p>Datataset research in the literature.</p><table><tr><td><p>Ref.</p></td><td><p>Year</p></td><td><p>Reference Equipment</p></td><td><p>Radar Type</p></td><td><p>Radar Frequency</p></td><td><p>Methodology</p></td><td><p>Scenarios</p></td><td><p>Trial</p><p>Number</p></td><td><p>Data</p><p>Format</p></td></tr><tr><td><p>Proposed</p></td><td><p>2023</p></td><td><p>Lidar</p></td><td><p>UWB</p></td><td><p>5.3 GHz, 7.5 GHz,</p><p>8.8 GHz</p></td><td><p>FFT</p></td><td><p>Biomedical and</p><p>Ambient-Assistant Living</p></td><td><p>8</p></td><td><p>.mat</p></td></tr><tr><td><p><a id="bookmark35"></a><a href="#bookmark36">[20]</a></p><p><em>(5)</em></p></td><td><p>2022</p><p>2022</p></td><td><p>ECG and respiratory belts</p><p>–</p></td><td><p>Continuous Wave FMCW</p></td><td><p>24 GHz</p><p>5.8 GHz, 400 MHz Bandwidth</p></td><td><p>FFT</p><p>FFT and classification algorithms</p></td><td><p>Biomedical</p><p>Human Activity Recognition</p></td><td><p>1</p><p>Up to 3</p></td><td><p>.csv .dat</p></td></tr><tr><td><p><a href="#bookmark37">[21]</a></p></td><td><p>2022</p></td><td><p>–</p></td><td><p>Continuous Wave</p></td><td><p>Human Activity Recognition</p></td><td><p>FFT, STFT and classification algorithms</p></td><td><p>Human Activity Recognition</p></td><td><p>–</p></td><td><p>.jpg, .mat</p></td></tr><tr><td><p><a href="#bookmark38">[22]</a></p></td><td><p>2020</p></td><td><p>PPG, ECG, PCG, radar, respiration sensor</p></td><td><p>Continuos Wave</p></td><td><p>24 GHz</p></td><td><p>R peaks and T-wave end detection, STFT</p></td><td><p>Biomedical</p></td><td><p>Up to 30</p></td><td><p>.mat and .csv</p></td></tr><tr><td><p><a 
href="#bookmark39">[23]</a></p></td><td><p>2022</p></td><td><p>Kinect sensor, Wifi Channel State Information</p></td><td><p>UWB</p></td><td><p>6.5 GHz</p></td><td><p>Convolutional neural network based classifier</p></td><td><p>Human Activity Recognition</p></td><td><p>Up to 6</p></td><td><p>.mat and .csv</p></td></tr><tr><td><p><a href="#bookmark40">[24]</a></p></td><td><p>2021</p></td><td><p>–</p></td><td><p>UWB</p></td><td><p>500 MHz</p></td><td><p>FFT, STFT, EEMD, LTS</p><p>Convolutional neural network</p></td><td><p>Human Activity Recognition</p></td><td><p>–</p></td><td><p>.jpg</p></td></tr><tr><td><p><a href="#bookmark41">[25]</a></p></td><td><p>2021</p></td><td><p>–</p></td><td><p>UWB</p></td><td><p>–</p></td><td><p>Convolutional neural network based classifier</p></td><td><p>Human Activity Recognition</p></td><td><p>1</p></td><td><p>.csv, .mat</p></td></tr><tr><td><p><a href="#bookmark42">[26]</a></p></td><td><p>2020</p></td><td><p>Task Force Monitor 3040i, ECG, ICG</p></td><td><p>Continuous Wave</p></td><td><p>24 GHz</p></td><td><p>Hidden semi-Markov model based segmentation, R-peak estimation</p></td><td><p>Biomedical</p></td><td><p>1</p></td><td><p>.mat</p></td></tr></table><p><strong>Ethics Statements</strong></p><p>This study was approved by the Ethics Committees of Istanbul Aydın University on 24.07.2017 with the report number 2017/13 and of Bahçeşehir University on 11.12.2019-E.3051 with the report number 20021704-604.02-.</p><p><strong>Funding</strong></p><p>This study was supported by the Istanbul Aydın University Scientific Research Project (BAP) research fund in 2017.</p><p><strong>Data Availability</strong></p><p><a href="https://data.mendeley.com/datasets/cbj37wdsdj">Radar Human Breathing Dataset for Applications of Ambient Assisted Living and Search and Rescue Operations (Original data) (Mendeley Data)</a></p><p><strong>CRediT Author Statement</strong></p><p><strong>Cansu Eren: </strong>Data curation, Formal analysis, Investigation, Methodology, Software, Validation, Visualization, Writing – original draft; <strong>Saeid Karamzadeh: </strong>Supervision; <strong>Mesut Kartal: </strong>Supervision.</p><p><strong>Declaration of Competing Interest</strong></p><p>The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.</p><p><strong>References</strong></p><p>[1] T. Shaik, X. Tao, N. Higgins, L. Li, R. Gururajan, X. Zhou, U.R. Acharya, Remote patient monitoring using artificial intelligence: current state, applications, and challenges, Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 13 (2023), <a id="bookmark4"></a><a id="bookmark5"></a><a id="bookmark6"></a>doi<a href="https://doi.org/10.1002/widm.1485">:10.1002/widm.1485.</a></p><p>[2] N. Casiddu, C. Porfirione, A. Monteriù, F. Cavallo, Lecture Notes in Electrical Engineering 540: Ambient Assisted Living, 2017, <a href="http://www.springer.com/series/7818">http://www.springer.com/series/7818.</a></p><p>[3] S. Yang, J. Le Kernec, O. Romain, F. Fioranelli, P.
Cadart, J. Fix, C. Ren, G. Manfredi, T. Letertre, I.D.H. Saenz, J. Zhang, H. Liang, X. Wang, G. Li, Z. Chen, K. Liu, X. Chen, J. Li, X. Wu, Y. Chen, T. Jin, The human activity radar challenge: benchmarking based on the “radar signatures of human activities” dataset from Glasgow University, IEEE J. Biomed. <a id="bookmark7"></a><a id="bookmark8"></a>Health Inform. 27 (2023) 1813–1824, doi<a href="https://doi.org/10.1109/JBHI.2023.3240895">:10.1109/JBHI.2023.3240895.</a></p><p>[4] M. Kebe, R. Gadhafi, B. Mohammad, M. Sanduleanu, H. Saleh, M. Al-qutayri, Human vital signs detection methods <a id="bookmark9"></a>and potential using radars: a review, Sensors 20 (2020), doi<a href="https://doi.org/10.3390/s20051454">:10.3390/s20051454.</a></p><p>[5] P. Wang, F. Qi, M. Liu, F. Liang, H. Xue, Y. Zhang, H. Lv, J. Wang, Noncontact heart rate measurement based on an improved convolutional sparse coding method using IR-UWB radar, IEEE Access 7 (2019) 158492–158502, doi:10.1109/ACCESS.2019.2950423.</p><p>[6] S. Nahar, T. Phan, F. Quaiyum, L. Ren, A.E. Fathy, O. Kilic, An electromagnetic model of human vital signs detection <a id="bookmark43"></a>and its experimental validation, IEEE J. Emerg. Sel. Top Circuits Syst. 8 (2018) 338–349, doi:10.1109/JETCAS.2018.2811339.</p><p>[7] H.T. Yen, M. Kurosawa, T. Kirimoto, Y. Hakozaki, T. Matsui, G. Sun, A medical radar system for non-contact vital sign monitoring and clinical performance evaluation in hospitalized older patients, Biomed. Signal Process. Control 75</p><p><a id="bookmark44"></a><a id="bookmark10"></a>(2022), doi<a href="https://doi.org/10.1016/j.bspc.2022.103597">:10.1016/j.bspc.2022.103597.</a></p><p>[8] H. Hong, L. Zhang, H. Zhao, H. Chu, C. Gu, M. Brown, X. Zhu, C. Li, Microwave sensing and sleep, IEEE Microw. Mag. <a id="bookmark11"></a>20 (2019) 18–29, doi<a href="https://doi.org/10.1109/MMM.2019.2915469">:10.1109/MMM.2019.2915469.</a></p><p>[9] A.A.
Pramudita, D.B. Lin, S.N. Hsieh, E. Ali, H.H. Ryanu, T. Adiprabowo, A.T. Purnomo, Radar system for detecting <a href="https://doi.org/10.1109/JSEN.2022.3188165">respiration vital sign of live victim behind the wall, IEEE Sens. J. 22 (2022) 14670–14685, doi:10.1109/JSEN.2022.3188165.</a></p><p>[10] C. Massaroni, A. Nicolò, D. Lo Presti, M. Sacchetti, S. Silvestri, E. Schena, Contact-based methods for measuring <a id="bookmark12"></a>respiratory rate, Sensors 19 (2019), doi<a href="https://doi.org/10.3390/s19040908">:10.3390/s19040908.</a></p><p><a id="bookmark13"></a><a id="bookmark14"></a><a href="http://refhub.elsevier.com/S2352-3409(23)00826-0/sbref0011">[11] M. Saad, A. Maali, M.S. Azzaz, A. Bouaraba, M. Benssalah, Development of an IR-UWB Radar System for High-Resolution Through-Wall Imaging, 2022.</a></p><p>[12] I. Immoreev, About UWB, IEEE Aerosp. Electron. Syst. Mag. 18 (2003) 8–10, doi<a href="https://doi.org/10.1109/MAES.2003.1246581">:10.1109/MAES.2003.1246581.</a></p><p><a id="bookmark15"></a>[13] S.A. Shah, F. Fioranelli, Human activity recognition: preliminary results for dataset portability using FMCW radar, 2019 International Radar Conference, RADAR 2019, Institute of Electrical and Electronics Engineers Inc., 2019, doi:10.1109/RADAR41533.2019.171307.</p><p>[14] S.M.M. Islam, O. Boric-Lubecke, V.M. Lubecke, A.K. Moadi, A.E. Fathy, Contactless radar-based sensors: recent <a id="bookmark16"></a>advances in vital-signs monitoring of multiple subjects, IEEE Microw. Mag. 23 (2022) 47–60, doi:10.1109/MMM.2022.3140849.</p><p>[15] T.K.V. Dai, Y. Yu, P. Theilmann, A.E. Fathy, O.
Kilic, Remote vital sign monitoring with reduced random body swaying motion using heartbeat template and wavelet transform based on constellation diagrams, IEEE J. Electromagn. RF <a id="bookmark17"></a><a id="bookmark24"></a>Microw. Med. Biol. 6 (2022) 429–436, doi<a href="https://doi.org/10.1109/JERM.2022.3140900">:10.1109/JERM.2022.3140900.</a></p><p>[16] L. Liu, Z. Liu, B.E. Barrowes, Through-wall bio-radiolocation with UWB impulse radar: observation, simulation and signal extraction, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 4 (2011) 791–798, doi<a href="https://doi.org/10.1109/JSTARS.2011.2157461">:10.1109/JSTARS.2011.2157461.</a></p><p>[17] STMicroelectronics, World's Smallest Time-of-Flight Ranging and <a id="bookmark25"></a><a id="bookmark34"></a>Gesture Detection Sensor, Datasheet - Production Data, 2021, <a href="http://www.st.com">www.st.com.</a></p><p>[18] Flatearth UWB Radar Solutions, Salsa Ancho Radar Module, Bozeman, MT 59718.</p><p>[19] X. Liang, H. Zhang, S. Ye, G. Fang, T.A. Gulliver, Improved denoising method for through-wall vital sign detection <a id="bookmark36"></a>using UWB impulse radar, Digit. Signal Process. 74 (2018) 72–93, doi<a href="https://doi.org/10.1016/j.dsp.2017.12.004">:10.1016/j.dsp.2017.12.004.</a></p><p>[20] K. Edanami, G. Sun, Medical radar signal dataset for non-contact respiration and heart rate measurement, Data Brief <a id="bookmark37"></a>40 (2022), doi<a href="https://doi.org/10.1016/j.dib.2021.107724">:10.1016/j.dib.2021.107724.</a></p><p>[21] M. Chakraborty, H.C. Kumawat, S.V. Dhavale, A.A.B. Raj, DIAT-μRadHAR (micro-Doppler signature dataset) and μRadNet <a id="bookmark38"></a><a href="https://doi.org/10.1109/JSEN.2022.3151943">(a lightweight DCNN) for human suspicious activity recognition, IEEE Sens. J. 22 (2022) 6851–6858, doi:10.1109/JSEN.2022.3151943.</a></p><p>[22] K. Shi, S. Schellenberger, C.
Will, T. Steigleder, F. Michler, J. Fuchs, R. Weigel, C. Ostgathe, A. Koelpin, A dataset of radar-recorded heart sounds and vital signs including synchronised reference sensor signals, Sci. Data 7 (2020), <a id="bookmark39"></a><a id="bookmark40"></a>doi<a href="https://doi.org/10.1038/s41597-020-0390-1">:10.1038/s41597-020-0390-1.</a></p><p>[23] M.J. Bocus, W. Li, S. Vishwakarma, R. Kou, C. Tang, K. Woodbridge, I. Craddock, R. McConville, R. Santos-Rodriguez, K. Chetty, R. Piechocki, OPERAnet, a multimodal activity recognition dataset acquired from radio frequency and <a id="bookmark41"></a>vision-based sensors, Sci. Data 9 (2022), doi<a href="https://doi.org/10.1038/s41597-022-01573-2">:10.1038/s41597-022-01573-2.</a></p><p>[24] Z. Zhengliang, Y. Degui, Z. Junchao, T. Feng, Dataset of human motion status using IR-UWB through-wall radar, J. Syst. Eng. Electron. 32 (2021) 1083–1096, doi<a href="https://doi.org/10.23919/JSEE.2021.000093">:10.23919/JSEE.2021.000093.</a></p><p><a id="bookmark42"></a>[25] S. Ahmed, D. Wang, J. Park, S.H. Cho, UWB-gestures, a public dataset of dynamic hand gestures acquired using impulse radar sensors, Sci. Data 8 (2021), doi<a href="https://doi.org/10.1038/s41597-021-00876-0">:10.1038/s41597-021-00876-0.</a></p><p>[26] S. Schellenberger, K. Shi, T. Steigleder, A. Malessa, F. Michler, L. Hameyer, N. Neumann, F. Lurz, R. Weigel, C. Ostgathe, A. Koelpin, A dataset of clinically recorded radar vital signs with synchronised reference sensor signals, Sci. Data 7</p><p>(2020), doi<a href="https://doi.org/10.1038/s41597-020-00629-5">:10.1038/s41597-020-00629-5.</a></p>