Hello,
The default openni_tracker package allows detecting and tracking a skeleton from the live stream of a Kinect.
However, I have point cloud data (type: sensor_msgs/PointCloud2) initially generated by a Kinect sensor and then modified (transformed, down-sampled, etc.).
I want to track a skeleton in this point cloud. Any ideas?
Regards
↧
Skeleton tracker using point cloud
↧
Installation dependency problem openni_tracker for Indigo (on Ubuntu 14.04 Trusty)
Hi,
I am trying to install openni_tracker for Indigo (on Ubuntu 14.04 Trusty, to use with a Kinect). I know the NiTE library is not available as a .deb. A friend suggested adding the precise main repository (where NiTE is available) and cloning openni_tracker from the git repo (as written below), but at a certain point it gives me an error:
Adding precise main to the sources:
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu precise main" > /etc/apt/sources.list.d/ros-latest.list'
sudo aptitude update
Creating the openni_tracker workspace:
mkdir -p ~/openni_tracker/src
cd ~/openni_tracker/src
catkin_init_workspace
cd ~/openni_tracker
catkin_make
source ~/openni_tracker/devel/setup.bash
echo "source ~/openni_tracker/devel/setup.bash" >> ~/.bashrc
Cloning openni_tracker from the GitHub repo:
git clone https://github.com/uoa-robotics/openni_tracker.git -b indigo-devel src/openni_tracker
Up to here I don't have any problem. Then I call
rosdep install openni_tracker
and it gives me the error:
> ERROR: the following packages/stacks could not have their rosdep keys resolved to system dependencies:
> openni_tracker: no definition of [libopenni-nite-dev] for OS version [trusty]
I installed libopenni-nite-dev via Synaptic; it's installed and I can see it. But it still gives me the error. How can I resolve this problem? Thanks
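One workaround sometimes suggested (a sketch, not an official fix): since the apt package is already installed, give rosdep a local rule that maps the unresolved key to it. The file path and the rule below are assumptions to adapt to your system:

```yaml
# /etc/ros/rosdep/local.yaml -- hypothetical local rosdep rules file
# mapping the unresolved key to the apt package already installed.
libopenni-nite-dev:
  ubuntu: [libopenni-nite-dev]
```

Register the file with `echo "yaml file:///etc/ros/rosdep/local.yaml" | sudo tee /etc/ros/rosdep/sources.list.d/10-local.list`, then run `rosdep update` and retry `rosdep install openni_tracker`.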
↧
how to [Prime Sense User Tracker Viewer]
Hello,
Like in the same question [link text](http://answers.ros.org/question/10637/how-to-prime-sense-user-tracker-viewer/), I can now work with [openni_tracker](http://wiki.ros.org/openni_tracker) and I can also visualize the published tf data in RViz, but I don't know how to start the PrimeSense User Tracker Viewer, as shown in [openni_tracker_png](http://www.ros.org/wiki/openni_tracker?action=AttachFile&do=get&target=nite.png).
I want to overlay the detected skeleton onto the corresponding RGB image. Is this possible? Are there packages that can achieve this?
I am new to the Kinect and computer vision; I use Hydro. Any suggestion will be appreciated!
Thank you !!
↧
How does openni_tracker distinguish left or right, front or back?
Hi guys,
I have been using openni_tracker for a while, using the positions of some joints and the direction of the person. However, now the person rotates more, and the detected direction gets messed up. After rotating and observing closely, I saw that when the person's side rotates past the Kinect, the left and right joints are swapped, which also flips the detected direction of the person.
So it is:
Before rotation: left | right
After 180 degrees: left | right (should be right | left)
After another 180 degrees: left | right
I also tried rotating while lifting my right hand up, and then the left and right joints were not mixed up. I think the lifted right hand gives the tracker something to track.
Does anyone have a solution for correctly detecting a person's direction?
↧
How can I setup skeleton tracking using a Kinect and ROS Indigo on Ubuntu 14.04?
I read about openni_tracker, but that doesn't work with Indigo, so I'm stuck. Any help would be much appreciated. Thanks!
↧
openni_tracker failed because of InitFromXml failed
Hello, I am a new user of ROS OpenNI.
I'm using Ubuntu 12.04 and my ROS distro is Groovy.
When I run the command `rosrun openni_tracker openni_tracker`, the only thing I get is an error message:
InitFromXml failed: Can't create any node of the requested type!
I followed several solutions from the ROS community but still get the same error message.
I googled a lot, but found no solution that works for me. Any help will be appreciated; it is really driving me crazy.
↧
nite or openni_tracker ?
Hi there again. Now I'm confused about the nite package (which is called openni_tracker today, right?).
I'm using ROS Indigo. I had a problem with the 'nite' package from ros-file-electric, so I changed to openni2_tracker.
I've installed openni_camera, openni2_camera, openni_launch, openni2_launch, openni_tracker, and openni2_tracker, and I still get an error when I run `catkin_make`:
> CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
> Please set them or make sure they are set and tested correctly in the CMake files:
> Nite2_INCLUDEDIR
> used as include directory in directory /home/adelleodel/rosadel/src/openni2_tracker
> .....
> Nite2_LIBRARY
> linked by target "openni2_tracker" in directory /home/adelleodel/rosadel/src/openni2_tracker
> -- Configuring incomplete, errors occurred!
This is my edited CMakeLists.txt:
....
find_package(openni2_tracker REQUIRED)
....
What should I change in CMakeLists.txt? I think I'm missing something. Thank you for your help!
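For what it's worth, the two NOTFOUND variables usually mean CMake cannot locate the NiTE2 headers and library. A minimal sketch, assuming the NiTE-Linux-x64-2.2 tarball was extracted to your home directory (adjust the paths to your actual install):

```cmake
# Hypothetical paths -- point these at your actual NiTE2 extraction.
set(Nite2_INCLUDEDIR "$ENV{HOME}/NiTE-Linux-x64-2.2/Include")
set(Nite2_LIBRARY    "$ENV{HOME}/NiTE-Linux-x64-2.2/Redist/libNiTE2.so")
```

Also note that `find_package(openni2_tracker REQUIRED)` inside openni2_tracker's own CMakeLists.txt is likely not what you want: a package should not find_package itself.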
↧
Problem tracking user in openni_tracker
Hi, I'm using openni_tracker to follow a person with a mobile robot (Indigo, Ubuntu 14.04, Kinect).
I can follow the person forward and backward, but when the robot turns, openni_tracker always loses the user.
Is there some code I can add to the program to solve this? Any suggestions?
Thanks in advance
↧
How to use Openni_Tracker with a moving camera?
Hi, I know that openni_tracker doesn't support a moving-camera application. I read that replacing the SceneAnalyzer with one that handles a moving camera might work. Do you know how?
Or maybe someone knows a simple package to track one person from a mobile platform using a Kinect.
Thanks!
↧
NiTE v1.5.2.23 not detected: NITE is likely missing
It's so weird: every time I try to run openni_tracker (`rosrun openni_tracker openni_tracker`, as in the KinectWithROSTutorial.pdf that I have), the terminal always shows:
[ERROR] [1454912600.620223128]: NITE is likely missing: Please install NITE >= 1.5.2.21. Check the readme for download information. Error Info: User generator failed: Can't create any node of the requested type!
even though I have installed NiTE v1.5.2.23 following the tutorial [here](https://www.kdab.com/setting-up-kinect-for-programming-in-linux-part-1/), and also tried downloading it from the [link](http://www.openni.ru/openni-sdk/openni-sdk-history-2/) in the README file mentioned in the terminal.
The thing is, I've successfully run the NiTE example mentioned in the tutorial [there](http://www.kdab.com/setting-up-kinect-for-programming-in-linux-part-1/), but it fails when I run openni_tracker.
I can also successfully `roslaunch openni_launch openni.launch`, and I tried `rosrun image_view image_view image:=/camera/rgb/image_color`.
So what's wrong with my setup?
↧
Error using openni_tracker with pr2_simulator
I'm attempting to use the openni_tracker package in conjunction with the pr2_simulator simulation.
I've followed instructions to do so from this page, [here](https://github.com/jstnhuang/cse481c_tutorials/wiki/How-to-run-openni_tracker).
I completed the installation (without error) up to the step `rosrun openni_tracker openni_tracker`. At that point, when trying to rosrun the package, I get this error:
joe@ROS:~$ rosrun openni_tracker openni_tracker
/home/joe/catkin_ws/src/openni_tracker/openni_tracker.xml
[ERROR] [1458319658.103197859]: InitFromXml failed: Can't create any node of the requested type!
How do I verify that openni_tracker is subscribing to the correct topic? And would that be related to the failure of the XML to load?
Any ideas?
Thank you.
↧
tf Listener for OpenNI Tracker [Solved]
Hello all,
I want to get the joint coordinates of the skeleton. I understand that I can get this information using a tf listener, as mentioned here: http://wiki.ros.org/tf/Tutorials/Writing%20a%20tf%20listener%20%28C%2B%2B%29
But I have trouble understanding how to specify each joint (from the Psi pose) in this section of code:
while (node.ok())
{
  tf::StampedTransform transform;
  try {
    listener.lookupTransform("/turtle2", "/turtle1",
                             ros::Time(0), transform);
  }
Is it possible for anyone to guide me or provide example code so that I can get all the joint coordinates?
**EDIT**
int main(int argc, char** argv)
{
  ros::init(argc, argv, "my_tf_listener");
  ros::NodeHandle node;
  ros::service::waitForService("spawn");
  ros::ServiceClient add_turtle = node.serviceClient<turtlesim::Spawn>("spawn");
  turtlesim::Spawn srv;
  add_turtle.call(srv);
  tf::TransformListener listener;
  ros::Rate rate(1.0);
  while (node.ok())
  {
    tf::StampedTransform transform;
    try
    {
      listener.lookupTransform("/head", "/neck", "/torso",
          "/left_shoulder", "/left_elbow", "/left_hand",
          "/right_shoulder", "/right_elbow", "/right_hand",
          "/left_hip", "/left_knee", "/left_foot",
          "/right_hip", "/right_knee", "/right_foot",
          ros::Time(0), transform);
    }
    catch (tf::TransformException &ex)
    {
      ROS_ERROR("%s", ex.what());
      ros::Duration(1.0).sleep();
      continue;
    }
    rate.sleep();
  }
  return 0;
}
I'm trying to do this, but I get an error during compilation:
error: no matching function for call to ‘tf::TransformListener::lookupTransform(const char [6], const char [6], const char [7], const char [15], const char [12], const char [11], const char [16], const char [13], const char [12], const char [10], const char [11], const char [11], const char [11], const char [12], const char [12], ros::Time, tf::StampedTransform&)’
ros::Time(0), transform);
^
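The compiler error is because `lookupTransform` accepts exactly one target frame and one source frame per call (plus a time and an output transform), not a list of frames. A minimal sketch of the loop approach, assuming user ID 1 and the default openni_tracker frame names; the ROS-specific calls are shown as comments so the sketch stays self-contained:

```cpp
#include <string>
#include <vector>

// lookupTransform() takes exactly ONE target frame and ONE source frame per
// call, so each joint needs its own call inside a loop. Helper building the
// 15 frame names openni_tracker publishes for a given user ID:
std::vector<std::string> jointFrames(const std::string& userId) {
    const char* joints[] = {
        "head", "neck", "torso",
        "left_shoulder", "left_elbow", "left_hand",
        "right_shoulder", "right_elbow", "right_hand",
        "left_hip", "left_knee", "left_foot",
        "right_hip", "right_knee", "right_foot"};
    std::vector<std::string> frames;
    for (const char* j : joints)
        frames.push_back("/" + std::string(j) + "_" + userId);
    return frames;
}

// The ROS part of the loop (comments only, so this compiles without roscpp):
//
//   for (const std::string& frame : jointFrames("1")) {
//       tf::StampedTransform transform;
//       try {
//           listener.lookupTransform("/openni_depth_frame", frame,
//                                    ros::Time(0), transform);
//       } catch (tf::TransformException& ex) {
//           ROS_ERROR("%s", ex.what());
//           continue;
//       }
//       // transform.getOrigin() now holds the joint coordinates
//   }
```

The fixed frame (`/openni_depth_frame` here) is an assumption; use whatever parent frame your tf tree actually publishes.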
Thanks,
Regards,
Prasanna
↧
Twist and /tf listener question
I'm making a /tf listener that listens to the output of a Kinect device.
I'd like to publish the velocity of the /left_hand_1 child frame of the /tf tree. So far, looking at the `Transformer::lookupTwist` function, I have these bits figured out but am missing pieces:
`Transformer::lookupTwist("/left_hand_1",
"/torso",
"reference_frame",
"reference_point",
"reference_point_frame",
ros::Time::now()+ros::Duration(0.1),
"ros::Duration()",
vel_msg);`
The signature of Transformer::lookupTwist can be found [here](http://docs.ros.org/electric/api/tf/html/c++/classtf_1_1Transformer.html#a150d91fc9f25e0a5fcb8ef5aea3ec645). My question is about the "reference_frame", "reference_point", and "reference_point_frame" parameters of the function: what do they refer to? Is the "reference_frame" the same frame in which I observe the "/tf" values in rviz, i.e. "openni_camera"? What about "reference_point" and "reference_point_frame"?
Thank you for the help.
↧
skeleton_marker / Openni_tracker Publish Depth Image
Folks,
skeleton_marker and openni_tracker only publish tf data. How can I publish the depth image used inside skeleton_tracker? I am converting the data in skeleton_tracker.cpp, but I get a black image in the node subscribed to the data, like so:
g_kinect_controller.getDepthGenerator().GetMetaData(depthMD);
.
.
cv::Mat depthWrapper(depthMD.YRes(), depthMD.XRes(), CV_32FC1, (void*) depthMD.Data());
sensor_msgs::ImagePtr msg = cv_bridge::CvImage(std_msgs::Header(),
sensor_msgs::image_encodings::TYPE_32FC1, depthWrapper).toImageMsg();
depth_image_pub_.publish(msg);
On my subscribing node I follow the standard cv_bridge subscriber tutorial, but I use 32FC1 instead of "bgr8"; then I convert to 8UC1 to process the image. But I get weird images, nothing like the image displayed in the skeleton_tracker OpenGL window.
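One likely cause, sketched below: OpenNI depth pixels are 16-bit unsigned millimetres (`XnDepthPixel` is a 16-bit type), so wrapping that buffer as `CV_32FC1` reinterprets the raw bytes as floats and yields garbage. In cv_bridge terms the fix would be to wrap the buffer as `CV_16UC1` and publish `TYPE_16UC1`; the explicit per-pixel scaling to a viewable 8-bit image looks like this (OpenCV-free so it stands alone; the 10 m max range is an assumption):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Convert a buffer of 16-bit depth values in millimetres to an 8-bit
// grayscale image, clamping at maxRangeMm and scaling linearly to 0..255.
std::vector<uint8_t> depthTo8Bit(const std::vector<uint16_t>& depthMm,
                                 uint16_t maxRangeMm = 10000) {
    std::vector<uint8_t> out(depthMm.size());
    for (std::size_t i = 0; i < depthMm.size(); ++i) {
        uint16_t d = std::min(depthMm[i], maxRangeMm);   // clamp far returns
        out[i] = static_cast<uint8_t>((d * 255) / maxRangeMm);
    }
    return out;
}
```

With OpenCV the same idea is `cv::Mat(depthMD.YRes(), depthMD.XRes(), CV_16UC1, (void*)depthMD.Data())` followed by `convertTo(..., CV_8UC1, 255.0 / maxRangeMm)`.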
↧
Multiple objects (with multiple node handles?)
Folks,
I am using openni_tracker to obtain the number of users inside my Kinect view.
From another node, subscribed to openni_tracker, I want to create a user object. When I create the object, it initializes its own node handle, its own subscribers/publishers, callback functions, etc. When a user disappears, I want to destroy the respective object. Spawning a new ROS node per new person is not fast enough, because I also need to do other things like gesture and face recognition, which make the spawning/shutdown of a node slow.
1) Can I create several objects, each with its own node handle? I was thinking of subscribing to openni_tracker and, from within my "people detected" callback function, appending a new object to a list or vector of people, sort of like this:
// a UserClass object has its own node handle and subscribers/publishers
vector<UserClass> users;

void chatterCallback(const openni::People::ConstPtr& msg)
{
  // check if a user disappeared, and erase it from the vector/list
  // ... write code
  // check if a new user came in
  if (msg->new_user)
  {
    users.push_back(UserClass(msg->new_user_id)); // or something similar
  }
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "v_recognize_gesture");
  ros::NodeHandle n;
  ros::Rate loop_rate(2);
  ros::Subscriber sub = n.subscribe("people", 1000, chatterCallback);
  while (ros::ok())
  {
    ros::spinOnce();
    loop_rate.sleep();
  }
  return 0;
}
2) Is it better to pass my main function's node handle into each UserClass object?
3) Would it be good to create a boost::thread pointer that can somehow create a new object every time?
Hope my question is not too confusing
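On question 1: a single shared `ros::NodeHandle` is usually enough; per-user objects can create their own subscribers/publishers from it without spawning new nodes. A sketch of the bookkeeping only, with `UserClass` and `UserManager` as hypothetical stand-ins (the ROS-specific constructor is indicated in a comment so the sketch compiles standalone):

```cpp
#include <cstddef>
#include <map>
#include <utility>

// Hypothetical per-user object. In a real node its constructor would take a
// shared ros::NodeHandle& and create the object's own subscribers/publishers
// from it -- cheap compared to spawning a whole node.
struct UserClass {
    int id;
    explicit UserClass(int id_) : id(id_) {}
    // UserClass(ros::NodeHandle& nh, int id);  // real-node variant
};

// Keeps exactly one UserClass per visible user; called from the
// openni_tracker subscription callback.
class UserManager {
    std::map<int, UserClass> users_;
public:
    void userAppeared(int id)    { users_.insert(std::make_pair(id, UserClass(id))); }
    void userDisappeared(int id) { users_.erase(id); }
    std::size_t count() const    { return users_.size(); }
};
```

Keying by user ID (rather than pushing into a vector) makes the "user disappeared" erase an O(log n) lookup instead of a linear search.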
↧
Openni_tracker giving an error message "InitFromXml failed"?
I am using ros indigo and kinect xbox.
When I run "**rosrun openni_tracker openni_tracker**", I get the error **InitFromXml failed: Can't create any node of the requested type!**
I followed the suggestions from [link1](http://answers.ros.org/question/9737/openni_tracker-initfromxml-failed-cant-create-any-node-of-the-requested-type/) and [link2](http://answers.ros.org/question/130891/problems-with-openni_tracker-launching/).
Here is the sequence of commands I followed. I installed `sudo apt-get install ros-indigo-freenect-stack` and `NITE-Bin-Dev-Linux-x64-v1.5.2.23`, then ran:
roslaunch freenect_launch freenect.launch
roslaunch pr2_LookAround_pkg face_detector.rgbd.launch
rosrun rviz rviz
I also tried adding the license with `sudo niLicense -l 0KOIk2JeIBYClPWVnMoRKn5cdY4=`, but no luck.
↧
ROS Jade on ubuntu 14.04 E: Unable to locate package ros-jade-openni-tracker
I am planning to control a Parrot Jumping Sumo with gestures using ROS and a Kinect. I have installed all the required packages except openni_tracker. When I try the command:
sudo apt-get install ros-jade-openni-tracker
I get the following error:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package ros-jade-openni-tracker
Can someone please help me solve the issue?
↧
follow a person using openni_tracker
Hi guys, I have openni_tracker working and I would like to make my Pioneer follow the detected user.
My idea was to use the tf data published by openni_tracker, but I don't know how to use it to make the robot move.
Do you guys have any idea?
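One common pattern (a sketch under assumptions, not the openni_tracker API): look up the user's torso frame relative to the camera frame each cycle, then turn the lateral offset and distance into a velocity command with a simple proportional law. Frame names, gains, target distance, and sign conventions below are assumptions to tune on the real robot; the ROS wiring is shown as comments so the control law itself compiles standalone:

```cpp
#include <utility>

// Proportional control law: given the tracked torso position in the camera
// frame (x = lateral offset in metres, z = distance in metres), return
// (linear, angular) velocities. Gains and target distance are made-up values.
std::pair<double, double> followCmd(double x, double z,
                                    double targetDist = 1.5,
                                    double kLin = 0.5,
                                    double kAng = 1.0) {
    double linear  = kLin * (z - targetDist);  // approach until targetDist away
    double angular = -kAng * x;                // rotate to keep the user centred
    return std::make_pair(linear, angular);
}

// In the ROS node this would be wired up roughly as (comments only):
//   listener.lookupTransform("/openni_depth_frame", "/torso_1",
//                            ros::Time(0), transform);
//   std::pair<double, double> cmd = followCmd(transform.getOrigin().x(),
//                                             transform.getOrigin().z());
//   geometry_msgs::Twist twist;
//   twist.linear.x  = cmd.first;
//   twist.angular.z = cmd.second;
//   cmd_vel_pub.publish(twist);
```

Depending on your fixed frame's axis convention you may need to swap which axes mean "forward" and "sideways", and flip the signs.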
Thanks
↧
How to guarantee tf last transformation in skeleton tracker
Hello all,
I am programming a gesture recognition node using the skeleton TF tree provided by the node openni_tracker (using the depth information of a Kinect at 30 FPS).
Under certain conditions (slight body angles, not facing the Kinect sensor perfectly) the tf output seems noisy and the algorithm fails to recognize the gestures.
The original code to read the tf tree and construct the gesture vector is as follows:
int ReadTransform(std::string targetFrame, std::string sourceFrame, std::vector<double>& transformVector)
{
tf::StampedTransform transform;
transformVector.clear();
try
{
listener->waitForTransform(targetFrame, sourceFrame,ros::Time(0),ros::Duration(0.3));
listener->lookupTransform(targetFrame, sourceFrame, ros::Time(0), transform);
}
catch (tf::TransformException &ex)
{
ROS_ERROR("%s",ex.what());
ros::Duration(1.0).sleep();
return -1;
}
transformVector.push_back(transform.getOrigin().x());
transformVector.push_back(transform.getOrigin().y());
transformVector.push_back(transform.getOrigin().z());
transformVector.push_back(transform.getRotation().x());
transformVector.push_back(transform.getRotation().y());
transformVector.push_back(transform.getRotation().z());
transformVector.push_back(transform.getRotation().w());
return 0;
}
int ReadTransforms(std::vector<double>& gestureVector)
{
std::vector<double> transformVector;
std::string user = boost::lexical_cast<std::string>(calibratedUser);
gestureVector.clear();
if(ReadTransform("left_hand_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("left_elbow_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("left_shoulder_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("neck_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("head_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("right_shoulder_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("right_elbow_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
if(ReadTransform("right_hand_" + user,"torso_" + user, transformVector) < 0) return -1;
InsertPointToGesture(transformVector, gestureVector);
return 0;
}
Pros: the return from "lookupTransform" is immediate, very fast.
Cons: I am not guaranteed the latest transformation. Could there be many tf transforms queued, so that I read transforms that occurred some seconds ago?
I have modified the listener part:
transformVector.clear();
try
{
ros::Time t = ros::Time::now();
listener->waitForTransform(targetFrame, sourceFrame, t, ros::Duration(0.3));
listener->lookupTransform(targetFrame, sourceFrame, t, transform);
}
Pros: the last transform is guaranteed.
Cons: MUCH SLOWER (almost 0.5 s to perform the eight transforms and construct the gesture vector).
Is there an alternative way to get the last transformation available in tf while increasing the speed?
Thank you all very much,
Alberto
↧
gesture recognition with kinect
Hi there. I want to use the gesture recognition feature of the Kinect for my project. I searched and found the openni_tracker package, but it doesn't work with the ROS Indigo version. I have already installed the NITE package, but I am still not able to run it. After running:
rosrun openni_tracker openni_tracker
[ERROR] [1483611464.589659387]: InitFromXml failed: Can't create any node of the requested type!
I followed the steps mentioned in this answer to the same question: [skeleton tracking using a Kinect and ROS Indigo on Ubuntu 14.04?](http://answers.ros.org/question/214421/how-can-i-setup-skeleton-tracking-using-a-kinect-and-ros-indigo-on-ubuntu-1404/)
↧