MATLAB Android camera and object tracking

By referring to this code:
url = 'http://<ip address>/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
I am able to display the video from the Android phone (using the IP Camera app), but I don't know how to combine this code with the code below:
vid = videoinput('winvideo', 1);
% Set the properties of the video object
set(vid, 'FramesPerTrigger', Inf);
set(vid, 'ReturnedColorspace', 'rgb')
vid.FrameGrabInterval = 5;
% Start the video acquisition here (currently the video is taken from the
% webcam; I want it to be taken from my Android phone instead)
start(vid)
% Set a loop that stops after 200 frames of acquisition
while(vid.FramesAcquired<=200)
% Get the snapshot of the current frame
data = getsnapshot(vid);
% Now to track red objects in real time
% we have to subtract the grayscale image from the red channel
% to extract the red components in the image.
diff_im = imsubtract(data(:,:,1), rgb2gray(data));
%Use a median filter to filter out noise
diff_im = medfilt2(diff_im, [3 3]);
% Convert the resulting grayscale image into a binary image.
diff_im = im2bw(diff_im,0.18);
% Remove all connected components smaller than 300 pixels
diff_im = bwareaopen(diff_im,300);
% Label all the connected components in the image.
bw = bwlabel(diff_im, 8);
% Here we do the image blob analysis.
% We get a set of properties for each labeled region.
stats = regionprops(bw, 'BoundingBox', 'Centroid');
% Display the image
imshow(data)
hold on
%This is a loop to bound the red objects in a rectangular box.
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','r','LineWidth',2)
plot(bc(1),bc(2), '-m+')
a=text(bc(1)+15,bc(2), strcat('X: ', num2str(round(bc(1))), ' Y: ', num2str(round(bc(2)))));
set(a, 'FontName', 'Arial', 'FontWeight', 'bold', 'FontSize', 12, 'Color', 'yellow');
end
hold off
end
% Both the loops end here.
% Stop the video acquisition.
stop(vid);
% Flush all the image data stored in the memory buffer.
flushdata(vid);
% Clear all variables
clear all

Accepted Answer

Walter Roberson on 15 Mar 2014
Replace
data = getsnapshot(vid);
with
data = imread(url);
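Putting that replacement together with the rest of the question's loop gives roughly the following sketch. It assumes the IP Camera app is serving JPEG snapshots at the URL from the question, so the <ip address> placeholder still has to be filled in, and it drops the videoinput-specific calls (start, stop, flushdata) that no longer apply.
% Sketch of the combined loop: the question's tracking code, with frames
% read from the phone instead of a videoinput object.
url = 'http://<ip address>/shot.jpg';
for k = 1:200                            % process about 200 frames
    data = imread(url);                  % grab the current frame from the phone
    % Subtract the grayscale image from the red channel to emphasize red
    diff_im = imsubtract(data(:,:,1), rgb2gray(data));
    diff_im = medfilt2(diff_im, [3 3]);  % median filter to reduce noise
    diff_im = im2bw(diff_im, 0.18);      % threshold to a binary image
    diff_im = bwareaopen(diff_im, 300);  % drop components smaller than 300 px
    stats = regionprops(diff_im, 'BoundingBox', 'Centroid');
    imshow(data)
    hold on
    for object = 1:length(stats)
        bb = stats(object).BoundingBox;
        bc = stats(object).Centroid;
        rectangle('Position', bb, 'EdgeColor', 'r', 'LineWidth', 2)
        plot(bc(1), bc(2), '-m+')
    end
    hold off
    drawnow
end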
  6 Comments
Diec Thuan on 11 Sep 2021
In the code, I don't see the declaration anywhere. Where is it?
Walter Roberson on 11 Sep 2021
If you are talking about the code
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
then what that code is doing is updating the CData property associated with the fh variable, where the fh variable has been created as an image graphics object. The CData property is the part of image graphics objects that is used to store the information about what array of values is to be displayed. It is being initialized in this particular section of code according to whatever the initial value of ss was at the time fh was created.


More Answers (2)

PIYUSH KUMAR on 14 Sep 2015
Here's working code for color detection using the Android camera:
url = 'http://192.168.0.100:8080/shot.jpg';
framesAcquired = 0;
while (framesAcquired <= 50)
data = imread(url);
framesAcquired = framesAcquired + 1;
diff_im = imsubtract(data(:,:,1), rgb2gray(data)); % subtract the grayscale image from the red channel to emphasize red
diff_im = medfilt2(diff_im, [3 3]); % used in image processing to reduce noise and for filtering
diff_im = im2bw(diff_im,0.18); % convert image to binary image
stats = regionprops(diff_im, 'BoundingBox', 'Centroid'); % measures a set of properties for each connected component in the binary image
drawnow;
imshow(data);
hold on
for object = 1:length(stats)
bb = stats(object).BoundingBox;
bc = stats(object).Centroid;
rectangle('Position',bb,'EdgeColor','b','LineWidth',2)
plot(bc(1),bc(2), '-m+')
end
hold off
end
%stop(vid); % to stop the video
%flushdata(vid); % erase the buffered video data
clear all
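One possible refinement, borrowed from the bwareaopen step in the question's original code, is to discard small noise blobs before calling regionprops, so stray specks that survive the threshold don't each get a bounding box. Only these lines would change:
diff_im = im2bw(diff_im, 0.18);      % convert to a binary image
diff_im = bwareaopen(diff_im, 300);  % discard components smaller than 300 px, as in the question's code
stats = regionprops(diff_im, 'BoundingBox', 'Centroid');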

Shahroze Hussain on 6 Jan 2017
Edited: Walter Roberson on 6 Jan 2017
url = 'http://192.168.10.2:8080/shot.jpg';
ss = imread(url);
fh = image(ss);
while(1)
ss = imread(url);
set(fh,'CData',ss);
drawnow;
end
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
'NumTrainingFrames', 50);
videoReader = vision.VideoFileReader('visiontraffic.avi');
for i = 1:150
frame = step(videoReader); % read the next video frame
foreground = step(foregroundDetector, frame);
end
%figure;
%imshow(frame);
%title('Video Frame');
%figure;
%imshow(foreground);
%title('Foreground');
se = strel('square', 3);
filteredForeground = imopen(foreground, se);
%figure;
%imshow(filteredForeground);
%title('Clean Foreground');
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
bbox = step(blobAnalysis, filteredForeground);
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
numCars = size(bbox, 1);
result = insertText(result, [10 10], numCars, 'BoxOpacity', 1, ...
'FontSize', 14);
%figure;
%imshow(result);
%title('Detected Cars');
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');
videoPlayer.Position(3:4) = [650,400]; % window size: [width, height]
se = strel('square', 3); % morphological filter for noise removal
while ~isDone(videoReader)
frame = step(videoReader); % read the next video frame
% Detect the foreground in the current video frame
foreground = step(foregroundDetector, frame);
% Use morphological opening to remove noise in the foreground
filteredForeground = imopen(foreground, se);
% Detect the connected components with the specified minimum area, and
% compute their bounding boxes
bbox = step(blobAnalysis, filteredForeground);
% Draw bounding boxes around the detected cars
result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
% Display the number of cars found in the video frame
numCars = size(bbox,1);
asciiChars = char(numCars+'A'-1);
result = insertText(result, [10 10], asciiChars, 'BoxOpacity',1,...
'FontSize', 14,'BoxColor','Green');
step(videoPlayer, result); % display the results
end
asciiChars = char(numCars+'A'-1)
release(videoReader); % close the video file
How do I combine these two codes?
  3 Comments
Rohil Setia on 4 May 2018
Not working. It is showing that blobAnalysis is undefined.
Walter Roberson on 7 May 2018
You will need to add in
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
'AreaOutputPort', false, 'CentroidOutputPort', false, ...
'MinimumBlobArea', 150);
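For it to take effect, the definition has to run before the first step(blobAnalysis, ...) call, together with the rest of the setup. A rough sketch using the same names as the posted code:
% Setup: must run before the detection loop
foregroundDetector = vision.ForegroundDetector('NumGaussians', 3, ...
    'NumTrainingFrames', 50);
videoReader = vision.VideoFileReader('visiontraffic.avi');
blobAnalysis = vision.BlobAnalysis('BoundingBoxOutputPort', true, ...
    'AreaOutputPort', false, 'CentroidOutputPort', false, ...
    'MinimumBlobArea', 150);
se = strel('square', 3);                 % morphological filter for noise removal
videoPlayer = vision.VideoPlayer('Name', 'Detected Cars');

while ~isDone(videoReader)
    frame = step(videoReader);
    foreground = step(foregroundDetector, frame);
    filteredForeground = imopen(foreground, se);
    bbox = step(blobAnalysis, filteredForeground);   % blobAnalysis is now defined
    result = insertShape(frame, 'Rectangle', bbox, 'Color', 'red');
    step(videoPlayer, result);
end
release(videoReader);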

