Abstract
Analyzing the behavior of non-conditioned animal models is an important task that enables a researcher to draw conclusions about the effects of a drug on behavior after its application. Because of the amount of data generated by this kind of test, an automated system that can record these behaviors becomes relevant. Several proposals aim at identifying and tracking the rodent in the open field maze; however, they do not include behavior identification, a highly desirable feature. Other works can identify behaviors, but their high computational cost requires special computers or devices. In this work, we propose an automatic system based on features computed by a stochastic filter that allows the development of rules to detect specific behaviors exhibited in the open field maze. We demonstrate that it is possible to track a rodent and identify behaviors in real time (30 fps) and even at high speed (>100 Hz) without powerful devices or special environmental conditions.
1 Introduction
Neuropharmacology is an important field that studies the effects of drugs on the nervous system. To verify these effects, drugs are tested on animals before they can be given to humans. The most common test animals are rodents, specifically rats, because their biology is well understood and they are easy to breed and feed. There are mainly two ways to verify the effects on the animal's nervous system: an invasive one, in which the brain chemistry is checked through a surgical procedure that involves sacrificing the animal; and a non-invasive one, which uses behavioral animal models to observe the animal's behaviors and compare them before and after drug application.
One of the most important tests is the Open Field Maze [16] (OFM, OFT). The test apparatus is a square box with a base and four walls, typically between 50 cm and 100 cm per side. A grid is painted on the base of the box to identify the zones where the rodent stays. The test can last from around 15 min to a few hours. Usually, researchers in the field place the animal in the maze, record a video of the entire test, and then watch the video to identify one or more behaviors. This process is repeated as many times as necessary, which introduces a systematic error into the results; moreover, the measures can vary depending on personal interpretation.
Given this problem, one solution is an automatic system that can detect and count the behaviors present during the open field test. Commercial systems exist, but they are too expensive for many of those who need them. Hence, many approaches have been proposed to address this need. The next section reviews the works proposed in recent years toward automatic behavior detection and tracking.
2 Related Work
The task of analyzing a rodent's behavior when it is placed in a test such as the open field maze has been approached in different ways since the early 1980s, when computer capabilities were still limited; combinations of algorithms and electronics were the first attempts reported [6, 7].
More recent approaches can successfully track the rodent during the test [5, 8, 19, 21, 23], but they do not provide behavior detection, which limits the potential results of the open field maze.
In some cases, controlled lighting conditions are required to create a high contrast between the animal and the scenario so that the rodent can be identified [1, 13], but neuroscience researchers cannot always set up these conditions.
We also found cases where invasive techniques are used to track the rodent. In the works presented in [2, 4, 11, 14], a device is surgically implanted in the animal to identify and track it. This is not ideal because the animal is exposed to unusual conditioning that could affect its behavior; moreover, invasive techniques compromise the animal's welfare.
In addition to identifying the rodent, it is important to detect specific behaviors that commonly appear during the test. For this reason, some works identify behaviors using special devices (infrared cameras, touch panels, sensors) or more powerful computers (faster CPUs, GPUs) [8, 9, 12, 15, 22].
Other approaches use depth cameras to identify the rodent's position and orientation; although they can analyze more than one rodent, they cannot identify behaviors such as spinning or freezing and only detect rearing [10, 18, 20].
The research developed in [8, 15] can successfully identify many rodent behaviors in the open field maze, but a high-cost computer is needed to achieve these results, and such devices are not accessible to many researchers in the field.
Not all approaches process the results in real time; those that do run at 30 fps typically use a reduced frame size, mainly \(320\times 240\) pixels [1, 9, 17, 19, 23]. The best approaches that achieve both real-time performance and behavior identification are limited to the open field maze [3, 20] and have not been proven in other mazes that may be of interest to researchers.
As this review shows, a system that can identify and track the rodent in real time, and also identify behaviors without requiring a high-cost computer, is needed. In this work, we propose a real-time system that can track the rodent efficiently and also detect specific behaviors present during the execution of the open field test.
3 Methods
To develop a system that can perform behavior detection and rodent tracking, we propose the methodology described in the following sections.
3.1 System Calibration
The position of the camera and the characteristics of the test box (color, illumination, position, etc.) are the first problems to solve. The system does not require a specific arena color or camera position; instead, a calibration process is implemented. When the test is ready, before placing the rat inside the box, the user manually marks the corners of the box by clicking on them in the image shown by the system. This calibration removes the need to adjust the camera position to match a specific area. Then, 5 s of camera footage of the empty scenario allows the system to learn the characteristics of the arena, without requiring a particular box, special colors, or special illumination.
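The calibration step above can be sketched as a simple per-pixel statistical model of the empty arena. The following is a minimal illustration in Python/numpy (the paper's implementation is in C++/OpenCV; the frame count and noise values here are assumptions for the example):

```python
import numpy as np

def learn_background(frames):
    """Per-pixel background model from a short clip of the empty arena.

    frames: iterable of HxW grayscale arrays (the ~5 s calibration clip).
    Returns the per-pixel mean and standard deviation.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    bg_mean = stack.mean(axis=0)
    bg_std = stack.std(axis=0)
    return bg_mean, bg_std

# Synthetic calibration clip: a static 4x4 arena with mild sensor noise.
rng = np.random.default_rng(0)
clip = [100 + rng.normal(0, 1, (4, 4)) for _ in range(150)]  # ~5 s at 30 fps
bg_mean, bg_std = learn_background(clip)
```

Because the model is learned from the scene itself, no assumption about arena color or lighting is baked into the system.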
3.2 Rodent Segmentation and Tracking
Before tracking the rodent, an observable parameter is needed for the EKF; we use the centroid of the rat (ratCentroid). Based on the background model mean (bgMean) learned during calibration, each pixel of the current frame is analyzed by computing its deviation from the model and verifying whether it falls within about four standard deviations of the Gaussian, classifying each pixel as background or non-background.
Using Eq. 1, we calculate the centroid of the rat from the segmentation.
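The segmentation rule and centroid computation can be sketched as follows, assuming grayscale frames and the numpy representation from the calibration example (the four-standard-deviation threshold follows Sect. 3.2; the arena values are illustrative):

```python
import numpy as np

def segment_and_centroid(frame, bg_mean, bg_std, k=4.0):
    """Classify pixels as foreground when they deviate more than k standard
    deviations from the background mean, then return the foreground centroid."""
    frame = np.asarray(frame, dtype=np.float64)
    fg = np.abs(frame - bg_mean) > k * np.maximum(bg_std, 1e-6)
    ys, xs = np.nonzero(fg)
    if xs.size == 0:
        return fg, None  # no rodent detected in this frame
    return fg, (xs.mean(), ys.mean())  # (x, y) centroid of the foreground

# Dark arena (value 100) with a bright 'rodent' blob of 3x3 pixels.
bg_mean = np.full((10, 10), 100.0)
bg_std = np.full((10, 10), 2.0)
frame = bg_mean.copy()
frame[4:7, 5:8] = 200.0
mask, centroid = segment_and_centroid(frame, bg_mean, bg_std)
```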
3.3 Features Extraction
As mentioned in the previous section, the EKF is used to extract dynamic information about the rodent. We propose the dynamical model described in Eq. 2, from which we obtain position (x, y), velocity (r), orientation (\(\theta \)), acceleration (\(\dot{r}\)) and angular velocity (\(\dot{\theta }\)).
For the prediction step (Eq. 9) we use the model above, and we use the rodent centroid calculated from the segmentation as the observable parameter in Eq. 10.
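Since Eqs. 2, 9 and 10 are not reproduced here, the following sketch shows one plausible structure for such an EKF step: a unicycle-style state \([x, y, \theta, r, \dot{r}, \dot{\theta}]\) with a centroid-only observation. The model, noise values, and 30 fps time step are assumptions for illustration, not the paper's exact equations:

```python
import numpy as np

DT = 1.0 / 30.0  # frame period at 30 fps (assumed)

def f(s):
    """Motion model: constant-acceleration speed, constant-rate heading.
    State s = [x, y, theta, r, r_dot, theta_dot]."""
    x, y, th, r, rd, thd = s
    return np.array([x + r * np.cos(th) * DT, y + r * np.sin(th) * DT,
                     th + thd * DT, r + rd * DT, rd, thd])

def F_jac(s):
    """Jacobian of f, needed because the model is nonlinear in theta."""
    x, y, th, r, rd, thd = s
    J = np.eye(6)
    J[0, 2] = -r * np.sin(th) * DT; J[0, 3] = np.cos(th) * DT
    J[1, 2] =  r * np.cos(th) * DT; J[1, 3] = np.sin(th) * DT
    J[2, 5] = DT; J[3, 4] = DT
    return J

H = np.zeros((2, 6)); H[0, 0] = H[1, 1] = 1.0  # only the centroid is observed

def ekf_step(s, P, z, Q, R):
    # Predict from the motion model.
    s_pred = f(s)
    Fk = F_jac(s)
    P_pred = Fk @ P @ Fk.T + Q
    # Update with the observed centroid z.
    y_res = z - H @ s_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    return s_pred + K @ y_res, (np.eye(6) - K @ H) @ P_pred

# A rodent moving right at 30 px/s observed once per frame (1 px per frame).
s = np.array([0.0, 0.0, 0.0, 30.0, 0.0, 0.0])
P, Q, R = np.eye(6), np.eye(6) * 1e-4, np.eye(2) * 0.25
for k in range(1, 11):
    s, P = ekf_step(s, P, np.array([float(k), 0.0]), Q, R)
```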
We also need information about the rat's shape: every time the rat moves or exhibits certain behaviors, its body shape changes. For example, when the rat is rearing its body stretches, and when it is grooming its body usually shrinks into a circular form. For this reason, we characterize its body shape deformations with Principal Component Analysis (PCA). This reduces the data to two axes that represent the length and width of the rat. With this, we can tell when the rat's body looks like an ellipse or a circle, which gives information about what the rat is probably doing. Combining all these features, we develop rules that identify the rat's behaviors. Figure 1 shows the features extracted from the rat.
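A minimal sketch of the PCA shape descriptor, applied to the foreground pixel coordinates of a segmentation mask (the interpretation of the two axes as body length and width follows the text; the blob below is synthetic):

```python
import numpy as np

def shape_axes(mask):
    """PCA on the foreground pixel coordinates. The square roots of the two
    eigenvalues act as the body 'length' and 'width': a ratio near 1 suggests
    a curled (circular) posture, a large ratio a stretched (elliptical) one."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    cov = np.cov(pts.T)
    evals, evecs = np.linalg.eigh(cov)          # ascending eigenvalues
    width, length = np.sqrt(np.maximum(evals, 0))
    return length, width, evecs[:, ::-1]        # principal axis first

# Elongated horizontal blob, standing in for a stretched rat body.
mask = np.zeros((20, 20), dtype=bool)
mask[9:11, 2:18] = True
length, width, axes = shape_axes(mask)
```

The two returned eigenvectors are what the system draws over the segmentation (the green and blue lines in Fig. 5c).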
3.4 Behaviors Detection
At this point we have identified the rodent and we know its position, velocity, direction (angle) and shape. So far we can track the rodent's motion; the next step is to determine what the rodent is doing in every frame. The behaviors required for this test are wall rearing, path distance, walking and freezing.
For wall-rearing detection, we generated rules that classify whether the rodent is rearing based on its shape and orientation. We observed in the tests that when the rat is wall rearing, two important characteristics are present: the body of the rat is beyond a limit that we can estimate, and, when this occurs, its body stretches so that the fitted ellipse has a major axis clearly greater than the minor one. With this analysis we estimate when a rearing is happening.
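A rule of this kind might be sketched as below. The wall-proximity margin and elongation ratio are illustrative guesses, since the paper does not state its exact thresholds:

```python
def is_wall_rearing(centroid, length, width, arena,
                    wall_margin=0.1, elong=1.5):
    """Heuristic wall-rearing rule: the body centroid is close to a wall and
    the fitted ellipse is clearly elongated. arena = (x_min, y_min, x_max,
    y_max) in pixels; both thresholds are assumed example values."""
    x, y = centroid
    x_min, y_min, x_max, y_max = arena
    m = wall_margin * min(x_max - x_min, y_max - y_min)
    near_wall = (x - x_min < m or x_max - x < m or
                 y - y_min < m or y_max - y < m)
    stretched = length > elong * width
    return near_wall and stretched

# Stretched body near the left wall vs. the same shape in the center.
arena = (0, 0, 100, 100)
rearing = is_wall_rearing((5, 50), 10, 4, arena)
centered = is_wall_rearing((50, 50), 10, 4, arena)
```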
Using the extracted features we can also detect freezing, i.e., the absence of movement of the rodent. Using the velocity, we estimate when the rodent is still and label this behavior as freezing.
We estimate the distance traveled by the rodent using the velocity parameter. The velocity is given as the total number of pixels moved since the previous frame; that is, we do not calculate the velocity in meters per second but in pixels moved per recorded time step. With this measure, and applying a rule based on the known size of the box, we estimate the distance the rodent has covered during the test.
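Both the freezing rule and the pixel-to-distance conversion can be sketched from the centroid track alone. The pixel-per-centimeter scale and the freezing threshold below are assumed example values:

```python
import numpy as np

PX_PER_CM = 640 / 60.0  # assumed scale: a 60 cm box spanning ~640 px

def summarize_motion(centroids, freeze_thresh_px=0.5):
    """Per-frame pixel displacement gives the speed; frames below the
    threshold are candidate freezing frames, and the cumulative displacement
    converts to centimeters via the known box size."""
    c = np.asarray(centroids, dtype=float)
    step = np.linalg.norm(np.diff(c, axis=0), axis=1)  # px moved per frame
    frozen = step < freeze_thresh_px
    distance_cm = step.sum() / PX_PER_CM
    return distance_cm, frozen

# 10 frames moving 2 px/frame, then 5 static frames.
centroids = [(2 * i, 0) for i in range(11)] + [(20, 0)] * 5
distance_cm, frozen = summarize_motion(centroids)
```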
4 Results
We presented the analysis and the proposed solution in the sections above. For the implementation, we used C++ with the OpenCV library for video and image operations (file opening, mathematical operations), all running under the Linux Ubuntu distribution on a computer with no special characteristics. We used a data set to test the proposed solution: each video was processed by the system, and the result of every algorithm implemented is shown below to verify the correct functioning of the system.
Supplementary video: https://youtu.be/6Smkff19r14.
4.1 Segmentation
For our purpose, the first step is system calibration; next, the rodent must be extracted from the frames. By applying the algorithm explained in Sect. 3.2, we separate the rodent from the background and use the segmentation to calculate its centroid. As we can see, even when the tail is not completely segmented (see Fig. 2a), the centroid is positioned correctly compared to when the system preserves the complete tail in the segmentation (see Fig. 2b).
4.2 Tracking
As explained earlier, the observable parameter for the EKF is the centroid obtained from the segmentation. To estimate the accuracy of the calculated centroid, we compare the system's output with hand-labeled centroid data (see Fig. 3). We calculated the RMSE for the x and y coordinates, obtaining an RMSE of 2.4 for x and 6.82 for y. This confirms that the calculated centroid is good enough for the EKF measurement; we must also consider that the hand-labeled data are not always at the exact center of the rodent.
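For reference, the error metric used here is the standard root-mean-square error over the per-frame coordinate differences, which can be computed as:

```python
import numpy as np

def rmse(pred, truth):
    """Root-mean-square error between predicted and hand-labeled values."""
    pred = np.asarray(pred, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.sqrt(np.mean((pred - truth) ** 2)))
```

Applied separately to the x and y coordinate sequences, this yields the 2.4 and 6.82 values reported above.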
In Fig. 4, we show the tracking of the rat estimated by the EKF (red circles). We show the comparison between the original frame and the segmentation, together with the generated tracking, for one representative video.
4.3 Behaviors Identification
The rules described in the previous sections were applied to the data set. An example of the visual result of rearing detection is shown in Fig. 5. Figure 5a shows the original frame from the video. In Fig. 5b, a blue oval is drawn around the rat every time it performs a wall rearing in the box. In addition, every wall rearing is counted and reported at the end of the process. Another result, shown in Fig. 5c, is the information given by PCA: green and blue lines drawn on the rodent segmentation represent the tendency of its shape. The last result shown is the bounding box, marked with a blue square. We do not process the whole image; we only work in the area restricted by the bounding box, which speeds up the process.
We can observe in the video that the camera is not positioned directly over the box; its inclination distorts the box into a trapezoid shape. Additionally, the box is not perfectly square, which increases the distortion. Because of this, some positions of the rodent confuse the algorithm and are counted as wall rearing.
4.4 Ethogram Generation
The previous section showed examples of the system operating on specific frames. Given the amount of data generated for the entire video, the system produces a report specifying the rodent's activity in every frame. This report is called an ethogram and is drawn as a colored graph in which each behavior is represented by one color. Figure 6 shows the ethogram for video 2. We can observe from the ethogram that the behavior of the rodent is not constant.
Blue represents the rodent performing a wall rearing, yellow indicates walking, and orange shows freezing.
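Conceptually, the ethogram is just the per-frame label sequence compressed into colored runs. A small sketch of that compression step (the label names below are assumptions matching the behaviors in the text):

```python
def ethogram_runs(labels):
    """Compress the per-frame behaviour report into (behaviour, start_frame,
    end_frame) runs, which is what the coloured ethogram bar draws."""
    runs = []
    for i, lab in enumerate(labels):
        if runs and runs[-1][0] == lab:
            # Extend the current run to cover this frame.
            runs[-1] = (lab, runs[-1][1], i)
        else:
            runs.append((lab, i, i))
    return runs

runs = ethogram_runs(["walk", "walk", "rear", "rear", "rear", "freeze"])
```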
At the beginning of the test, the rodent is not yet familiar with the box, so exploratory behavior appears: the rodent needs to sniff (including wall rearing) and travel across the whole box, which is what we find in the first part of the ethogram. After a few minutes, wall rearing is present for longer periods, combined with walking. Once the rat becomes familiar with the environment, its activity decreases drastically; this is observed as freezing (orange), because the rodent's need to explore decreases.
4.5 Time Execution
To evaluate how quickly our proposal produces results, we measured the time required by each module. For this test, we divided the complete process into three steps: segmentation, position prediction (tracking) and behavior detection. Table 1 shows the mean times of the main blocks for each video. From the table, we notice that segmentation is the longest step and dominates the complete process. Computing the average time needed to process each frame of the video, we show that our proposal runs in real time; moreover, its maximum speed is over 100 Hz. This time is better than most reported in the related work.
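A per-module measurement of this kind can be sketched with a simple wall-clock harness; the stage functions below are placeholders, not the paper's actual pipeline code:

```python
import time

def mean_time(fn, repeats=200):
    """Average wall-clock time per call for one pipeline stage
    (an illustrative harness, not the paper's measurement code)."""
    t0 = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - t0) / repeats

# Placeholder workloads standing in for the three pipeline stages.
stage_times = {"segmentation": mean_time(lambda: sum(range(1000))),
               "tracking": mean_time(lambda: sum(range(100))),
               "behaviors": mean_time(lambda: sum(range(100)))}
fps = 1.0 / sum(stage_times.values())  # achievable frame rate estimate
```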
5 Conclusion and Future Work
In this paper, we have presented a system for rodent tracking and behavior detection. In our proposal we did not change the initial conditions of the test; we worked directly on the videos without any prior information or manual adjustments. Even when the camera position was not ideal, we correctly segmented and identified the rodent. In addition, our system was able to detect behaviors of particular interest in the test, from which an ethogram was also generated: a graph that experts can use to analyze the rodent's behaviors over time and after a drug has been applied.
Thus, we demonstrated that it is possible to perform tracking and behavior identification successfully without any special conditions, and that our proposal runs at high speed, over 100 Hz, without requiring special hardware such as a GPU.
For future work, we propose using other classification techniques to detect more behaviors and comparing them with the current results in order to improve behavior detection. We will also extend the work to other mazes, such as the water maze or the elevated plus maze, and detect the corresponding behaviors presented in each test.
References
1. da Silva Aragão, R., Rodrigues, M.A.B., de Barros, K.M.F.T., Silva, S.R.F., Toscano, A.E., de Souza, R.E., Manhães-de-Castro, R.: Automatic system for analysis of locomotor activity in rodents–A reproducibility study. J. Neurosci. Methods 195(2), 216–221 (2011)
2. Howerton, C.L., Garner, J.P., Mench, J.A.: A system utilizing radio frequency identification (RFID) technology to monitor individual rodent behavior in complex social settings. J. Neurosci. Methods 209(1), 74–78 (2012)
3. van Dam, E.A., van der Harst, J.E., ter Braak, C.J., Tegelenbosch, R.A., Spruijt, B.M., Noldus, L.P.: An automated system for the recognition of various specific rat behaviours. J. Neurosci. Methods 218(2), 214–224 (2013)
4. Sourioux, M., et al.: 3-D motion capture for long-term tracking of spontaneous locomotor behaviors and circadian sleep/wake rhythms in mouse. J. Neurosci. Methods 295, 51–57 (2018)
5. Chanchanachitkul, W., Nanthiyanuragsa, P., Rodamporn, S., Thongsaard, W., Charoenpong, T.: A rat walking behavior classification by body length measurement. In: The 6th 2013 Biomedical Engineering International Conference, pp. 1–5 (2013)
6. Clarke, R.L., Smith, R.F., Justesen, D.R.: An infrared device for detecting locomotor activity. Behav. Res. Methods Instrum. Comput. 17(5), 519–525 (1985)
7. Gapenne, O., Simon, P., Lannou, J.: A simple method for recording the path of a rat in an open field. Behav. Res. Methods Instrum. Comput. 22(5), 443–448 (1990)
8. Geuther, B.Q., et al.: Robust mouse tracking in complex environments using neural networks. Commun. Biol. 2(1), 124 (2018)
9. Giancardo, L., Sona, D., Scheggia, D., Papaleo, F., Murino, V.: Segmentation and tracking of multiple interacting mice by temperature and shape information. In: Proceedings of the 21st International Conference on Pattern Recognition, ICPR 2012, pp. 2520–2523 (2012)
10. Hong, W., et al.: Automated measurement of mouse social behaviors using depth sensing, video tracking, and machine learning. Proc. Nat. Acad. Sci. 112(38), E5351–E5360 (2015)
11. Jia, Y., Wang, Z., et al.: A wirelessly-powered homecage with animal behavior analysis and closed-loop power control. In: 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 6323–6326 (2016)
12. Lai, P.L., Basso, D.M., Fisher, L.C., Sheets, A.L.: 3D tracking of mouse locomotion using shape-from-silhouette techniques (2011)
13. Linares-Sánchez, L.J., Fernández-Alemán, J.L., García-Mateos, G., Pérez-Ruzafa, Á., Sánchez-Vázquez, F.J.: Follow-me: a new start-and-stop method for visual animal tracking in biology research. In: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 755–758 (2015)
14. Macrì, S., Mainetti, L., Patrono, L., Pieretti, S., Secco, A., Sergi, I.: A tracking system for laboratory mice to support medical researchers in behavioral analysis. In: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 4946–4949 (2015)
15. Sebov, K.: Deep rearing. Technical report, Stanford University (2017)
16. Seibenhener, M., Wooten, M.C.: Use of the open field maze to measure locomotor and anxiety-like behavior in mice. J. Vis. Exp. (JoVE) 96, e52434 (2015)
17. Shi, Q., Miyagishima, S., Fumino, S., Konno, S., Ishii, H., Takanishi, A.: Development of a cognition system for analyzing rat's behaviors. In: 2010 IEEE International Conference on Robotics and Biomimetics, pp. 1399–1404 (2010)
18. da Silva Monteiro, J.P.: Automatic Behavior Recognition in Laboratory Animals using Kinect. Master's thesis, Faculdade de Engenharia da Universidade do Porto (2012)
19. Tungtur, S.K., Nishimune, N., Radel, J., Nishimune, H.: Mouse behavior tracker: an economical method for tracking behavior in home cages. BioTechniques 63(5), 215–220 (2017)
20. Wang, Z., Mirbozorgi, S.A., Ghovanloo, M.: Towards a kinect-based behavior recognition and analysis system for small animals. In: 2015 IEEE Biomedical Circuits and Systems Conference (BioCAS), pp. 1–4 (2015)
21. Wilson, J.C., Kesler, M., Pelegrin, S.L.E., Kalvi, L., Gruber, A., Steenland, H.W.: Watching from a distance: A robotically controlled laser and real-time subject tracking software for the study of conditioned predator/prey-like interactions. J. Neurosci. Methods 253, 78–89 (2015)
22. Xie, X.S., et al.: Rodent Behavioral Assessment in the Home Cage using the SmartCage™ System, pp. 205–222. Humana Press, Totowa, NJ (2012)
23. Ziegelaar, M.: Development of an inexpensive, user modifiable automated video tracking system for rodent behavioural tests. Master's thesis, School of Mechanical and Mining Engineering (2015)
Acknowledgments
We thank Ilhuicamina Daniel Limón Pérez de León, Ph.D., head researcher of the Neuroscience Laboratory at Benemérita Universidad Autónoma de Puebla, for providing us with the data set used in the evaluation and for his guidance on the behaviors of interest in the open field maze.
Cite this paper
Cocoma-Ortega, J.A., Martinez-Carranza, J. (2019). Towards a Rodent Tracking and Behaviour Detection System in Real Time. In: Carrasco-Ochoa, J., Martínez-Trinidad, J., Olvera-López, J., Salas, J. (eds) Pattern Recognition. MCPR 2019. Lecture Notes in Computer Science, vol 11524. Springer, Cham. https://doi.org/10.1007/978-3-030-21077-9_15
Print ISBN: 978-3-030-21076-2
Online ISBN: 978-3-030-21077-9