
I have an IP camera that sends a char buffer containing an image over the network. I can't access the image until I set up a connection to the camera in a program. I am trying to dissect the Windows source filter code, but I'm not making much progress, so I thought I'd ask: is it possible to take a buffer like that and cast it to something whose pin can connect to an AVI Splitter (or similar) in DirectShow/.NET?

(video buffer from IP Cam) -> (???) -> (AVI Splitter) -> (Profit)

Update

I have my program capturing video in one namespace, and I have the code from the GSSF in its own namespace. I pass a pointer with an image from the camera namespace to the GSSF namespace. This only happens once, though, so the graph streams from that single image while the camera keeps streaming from the network. Is there a way to continually pass the buffer from the camera to the GSSF, or should I combine the namespaces somehow? I tried sending the main camera pointer to the GSSF, but it crashed, because the GSSF reads the pointer while the camera is writing to it. Maybe if I grabbed an image, passed the pointer, and waited before grabbing a new one?
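
One way to do that handoff, sketched here with hypothetical names, is to keep a shared managed buffer guarded by a lock: the camera thread copies each new frame in, and the GSSF callback copies the latest frame out, so neither side ever reads memory the other is writing:

static readonly object frameLock = new object();
static byte[] latestFrame = new byte[1280 * 720 * 2];

// Camera thread: snapshot each new frame into the shared buffer.
static void OnCameraFrame(IntPtr camBuffer)
{
    lock (frameLock)
    {
        Marshal.Copy(camBuffer, latestFrame, 0, latestFrame.Length);
    }
}

// GSSF side: called once per output frame; copies the latest frame out.
static int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
{
    lock (frameLock)
    {
        Marshal.Copy(latestFrame, 0, ip, Math.Min(iSize, latestFrame.Length));
    }
    iRead = iSize;
    return 0; // 0 = success, 1 = end of stream
}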

Update 2

I shrunk my code, and now that I look at it, I don't believe I'm using namespaces correctly either.

namespace Cam_Controller
{
    static byte[] mainbyte = new byte[1280*720*2];
    static IntPtr main_ptr = new IntPtr();

    //(this function is threaded)
    static void Trial(NPvBuffer mBuffer, NPvDisplayWnd mDisplayWnd, VideoCompression compressor)
    {
        Functions function = new Functions();
        Defines define = new Defines();
        NPvResult operationalResult = new NPvResult();
        VideoCompression mcompressor = new VideoCompression();

        int framecount = 0;
        while (!Stopping && AcquiringImages)
        {
            // NB: this Mutex is created fresh on every iteration and never
            // acquired, so it doesn't actually protect the shared buffer.
            // (wer, Stopping, AcquiringImages and mDevice are declared elsewhere.)
            Mutex lock_video = new Mutex();
            NPvResult result = mDevice.RetrieveNextBuffer(mBuffer, operationalResult);

            if (result.isOK())
            {
                framecount++;
                wer = (int)mDisplayWnd.Display(mBuffer, wer);

                main_ptr = (IntPtr)mBuffer.GetMarshalledBuffer();

                // 1280 x 720 pixels, 2 bytes per pixel (UYVY)
                Marshal.Copy(main_ptr, mainbyte, 0, 1280 * 720 * 2);
            }
        }
    }
    private void button7_Click(object sender, EventArgs e)
    {
        // Grab a single frame and hand a copy of it to DxPlay.
        // (main_byte1 is a second frame buffer, declared elsewhere.)
        IntPtr dd = (IntPtr)mBuffer.GetMarshalledBuffer();
        Marshal.Copy(dd, main_byte1, 0, 1280 * 720 * 2);
        play = new VisiCam_Controller.DxPlay.DxPlay("", panel9, main_byte1);
        play.Start();
    }


    // NB: C# won't allow a method to sit directly inside a namespace, and a
    // nested namespace like this is part of what's wrong with the arrangement.
    namespace DxPlay
    {
        public class DxPlay
        {
            public DxPlay(string sPath, Control hWin, byte[] color)
            {
                try
                {
                    // Pick one of our image providers
                    //m_ImageHandler = new ImageFromFiles(sPath, 24);
                    m_ImageHandler = new ImageFromPixels(20, color);
                    //m_ImageHandler = new ImageFromMpg(@"c:\c1.mpg");
                    //m_ImageHandler = new ImageFromMpg(sPath);
                    //m_ImageHandler = new ImageFromMP3(@"c:\vss\media\track3.mp3");

                    // Set up the graph
                    SetupGraph(hWin);
                }
                catch
                {
                    Dispose();
                    throw;
                }
            }
        }
        abstract internal class ImageHandler { /* body elided */ }

        internal class ImageFromPixels : ImageHandler
        {
            private int[] mainint = new int[720 * 1280];

            // (Fields such as mainptr, U, Y, V, Y2, m_FPS, m_b, m_g and the
            // UNIT constant are declared elsewhere in the class.)
            unsafe public ImageFromPixels(long FPS, byte[] x)
            {
                // NB: this stores the number 720*1280*3 as an address; it does
                // not allocate a buffer of that size.
                long fff = 720 * 1280 * 3;
                mainptr = new IntPtr(fff);

                // Repack UYVY (4 bytes -> 2 pixels) into packed ints.
                for (int p = 0; p < 720 * 640; p++)
                {
                    U = x[p * 4 + 0];
                    Y = x[p * 4 + 1];
                    V = x[p * 4 + 2];
                    Y2 = x[p * 4 + 3];

                    int one = V << 16 | Y << 8 | U;
                    int two = V << 16 | Y2 << 8 | U;
                    mainint[p * 2 + 0] = one;
                    mainint[p * 2 + 1] = two;
                }

                m_FPS = UNIT / FPS;
                m_b = 211;
                m_g = 197;
            }
        }
    }
}

There's also GetImage, but that's much the same: copy the buffer into the pointer. What happens is I grab a buffer of the image and send it to the DxPlay class. It processes it and puts it on the DirectShow line without problems, but it never updates, because it's just a single buffer. If I instead send DxPlay an IntPtr holding the address of the image buffer, the code crashes for accessing memory, because I assume the ImageFromPixels code (not shown above; change

(x[p * 4 + #]) 

to

(IntPtr)(x.ToInt64() + p * 4 + #)   // where x is now passed as an IntPtr

) is reading the memory behind the pointer while the Cam_Controller class is writing to it. I make and pass copies of the IntPtrs, and new IntPtrs, but they fail halfway through the conversion.
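
Copying an IntPtr only copies the address, not the pixels behind it, so the camera thread can still overwrite them mid-read; a snapshot taken under the same lock the writer holds avoids that. A minimal sketch with hypothetical names:

byte[] snapshot = new byte[1280 * 720 * 2];
lock (frameLock)   // the same lock the camera thread holds while writing
{
    Marshal.Copy(camPtr, snapshot, 0, snapshot.Length);
}
// ImageFromPixels can now index snapshot[p * 4 + n] safely.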

Grant
  • Unfortunately it's not that trivial – each source filter is unique, at least in the type and number of output pins it supports. You might have to look for or implement a fake source filter. – BrokenGlass Aug 10 '11 at 20:46
  • Grant: I think you are not using the word 'namespace' correctly. Namespaces have everything to do with arranging code logically, but not much to do with pointers. Can you update your question with some of the provider (webcam) and consumer (GSSF) code? – Dennis Smit Sep 07 '11 at 00:19
  • I have started fixing the code and I am getting close. I can pass buffers between the cam capture and a single DirectShow line. – Grant Sep 07 '11 at 19:33
  • It works now, for the moment. I set a method of the buffer from the GSSF inside the thread that grabs the camera buffers, and I copy the camera buffer to the GSSF method buffer; when GetImage fires each cycle, the buffer has the image in it. – Grant Sep 07 '11 at 23:23

1 Answer


If you want to do this in .NET, the following steps are needed:

  1. Use the DirectShow.NET Generic Sample Source Filter (GSSF.AX) from the Misc/GSSF directory within the sample package. A source filter is always a COM module, so you need to register it too, using "regsvr32 GSSF.ax".

  2. Implement a bitmap provider in .NET

  3. Set up a graph and connect the pin from the GSSF to your bitmap provider implementation (a condensed sketch of this wiring follows the list).

  4. Pray.
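
Condensed, that wiring looks roughly like this (distilled from the full code below; myBitmapProvider stands in for your IVideoSource implementation):

var graph = (IFilterGraph2)new FilterGraph();

// The GSSF must already be registered (regsvr32 GSSF.ax)
var source = (IBaseFilter)new GenericSampleSourceFilter();
int hr = graph.AddFilter(source, "GSSF");
DsError.ThrowExceptionForHR(hr);

// Configure its output pin: media type plus the per-sample callback
IPin pin = DsFindPin.ByDirection(source, PinDirection.Output, 0);
var config = (IGenericSampleConfig)pin;
myBitmapProvider.SetMediaType(config);   // e.g. via SetMediaTypeFromBitmap
config.SetBitmapCB(myBitmapProvider);    // GSSF calls SampleCallback per frame
Marshal.ReleaseComObject(pin);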

I am using the following within a project, and I made it reusable for future use.

The code (not the best, and not finished, but a working start) takes an IVideoSource, which is shown below:

public class VideoSourceToVideo : IDisposable
{
    object locker = new object();

    public event EventHandler<EventArgs> Starting;
    public event EventHandler<EventArgs> Stopping;
    public event EventHandler<EventArgs> Completed;

    /// <summary> graph builder interface. </summary>
    private DirectShowLib.ICaptureGraphBuilder2 captureGraphBuilder = null;
    DirectShowLib.IMediaControl mediaCtrl = null;
    IMediaEvent mediaEvent = null;
    bool stopMediaEventLoop = false;
    Thread mediaEventThread;

    /// <summary> Dimensions of the image, calculated once in constructor. </summary>
    private readonly VideoInfoHeader videoInfoHeader;

    IVideoSource source;

    public VideoSourceToVideo(IVideoSource source, string destFilename, string encoderName)
    {
        try
        {
            this.source = source;

            // Set up the capture graph
            SetupGraph(destFilename, encoderName);
        }
        catch
        {
            Dispose();
            throw;
        }
    }


    /// <summary> release everything. </summary>
    public void Dispose()
    {
        StopMediaEventLoop();
        CloseInterfaces();
    }

    /// <summary> build the capture graph for grabber. </summary>
    private void SetupGraph(string destFilename, string encoderName)
    {
        int hr;

        // Get the graphbuilder object
        captureGraphBuilder = new DirectShowLib.CaptureGraphBuilder2() as DirectShowLib.ICaptureGraphBuilder2;

        IFilterGraph2 filterGraph = new DirectShowLib.FilterGraph() as DirectShowLib.IFilterGraph2;

        mediaCtrl = filterGraph as DirectShowLib.IMediaControl;
        IMediaFilter mediaFilt = filterGraph as IMediaFilter;
        mediaEvent = filterGraph as IMediaEvent;

        hr = captureGraphBuilder.SetFiltergraph(filterGraph);
        DsError.ThrowExceptionForHR(hr);

        IBaseFilter aviMux;
        IFileSinkFilter fileSink = null;
        hr = captureGraphBuilder.SetOutputFileName(MediaSubType.Avi, destFilename, out aviMux, out fileSink);
        DsError.ThrowExceptionForHR(hr);

        DirectShowLib.IBaseFilter compressor = DirectShowUtils.GetVideoCompressor(encoderName);

        if (compressor == null)
        {
            throw new InvalidCodecException(encoderName);
        }


        hr = filterGraph.AddFilter(compressor, "compressor");
        DsError.ThrowExceptionForHR(hr);


        // Our data source
        IBaseFilter source = (IBaseFilter)new GenericSampleSourceFilter();

        // Get the pin from the filter so we can configure it
        IPin ipin = DsFindPin.ByDirection(source, PinDirection.Output, 0);

        try
        {
            // Configure the pin using the provided BitmapInfo
            ConfigurePusher((IGenericSampleConfig)ipin);
        }
        finally
        {
            Marshal.ReleaseComObject(ipin);
        }

        // Add the filter to the graph (once; the original added it twice)
        hr = filterGraph.AddFilter(source, "GenericSampleSourceFilter");
        DsError.ThrowExceptionForHR(hr);

        hr = captureGraphBuilder.RenderStream(null, null, source, compressor, aviMux);
        DsError.ThrowExceptionForHR(hr);

        // (IMediaPosition is queried here but not used further.)
        IMediaPosition mediaPos = filterGraph as IMediaPosition;

        hr = mediaCtrl.Run();
        DsError.ThrowExceptionForHR(hr);
    }

    private void ConfigurePusher(IGenericSampleConfig ips)
    {
        int hr;

        source.SetMediaType(ips);

        // Specify the callback routine to call with each sample
        hr = ips.SetBitmapCB(source);
        DsError.ThrowExceptionForHR(hr);
    }


    private void StartMediaEventLoop()
    {
        mediaEventThread = new Thread(MediaEventLoop)
        {
            Name = "Offscreen Vid Player Medialoop",
            IsBackground = false
        };

        mediaEventThread.Start();
    }

    private void StopMediaEventLoop()
    {
        stopMediaEventLoop = true;

        if (mediaEventThread != null)
        {
            mediaEventThread.Join();
        }
    }

    public void MediaEventLoop()
    {
        MediaEventLoop(x => PercentageCompleted = x);
    }

    public double PercentageCompleted
    {
        get;
        private set;
    }

    // FIXME this needs some work, to be completely in-tune with needs.
    public void MediaEventLoop(Action<double> UpdateProgress)
    {
        mediaEvent.CancelDefaultHandling(EventCode.StateChange);
        //mediaEvent.CancelDefaultHandling(EventCode.Starvation);

        while (stopMediaEventLoop == false)
        {
            try
            {
                EventCode ev;

                IntPtr p1, p2;
                if (mediaEvent.GetEvent(out ev, out p1, out p2, 0) == 0)
                {
                    switch (ev)
                    {
                        case EventCode.Complete:
                            Stopping.Fire(this, null);
                            if (UpdateProgress != null)
                            {
                                UpdateProgress(source.PercentageCompleted);
                            }
                            return;


                        case EventCode.StateChange:
                            FilterState state = (FilterState)p1.ToInt32();

                            if (state == FilterState.Stopped || state == FilterState.Paused)
                            {
                                Stopping.Fire(this, null);
                            }
                            else if (state == FilterState.Running)
                            {
                                Starting.Fire(this, null);
                            }

                            break;

                        // FIXME add abort and stuff, and propagate this.
                    }

                    //                        Trace.WriteLine(ev.ToString() + " " + p1.ToInt32());

                    mediaEvent.FreeEventParams(ev, p1, p2);
                }
                else
                {
                    if (UpdateProgress != null)
                    {
                        UpdateProgress(source.PercentageCompleted);
                    }
                    // FiXME use AutoResetEvent
                    Thread.Sleep(100);
                }
            }
            catch (Exception e)
            {
                Trace.WriteLine("MediaEventLoop: " + e);
            }
        }
    }

    /// <summary> Shut down capture </summary>
    private void CloseInterfaces()
    {
        int hr;

        try
        {
            if (mediaCtrl != null)
            {
                // Stop the graph
                hr = mediaCtrl.Stop();
                mediaCtrl = null;
            }
        }
        catch (Exception ex)
        {
            Debug.WriteLine(ex);
        }

        if (captureGraphBuilder != null)
        {
            Marshal.ReleaseComObject(captureGraphBuilder);
            captureGraphBuilder = null;
        }

        GC.Collect();
    }

    public void Start()
    {
        StartMediaEventLoop();
    }
}
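
Usage then looks something like this (a rough sketch; the file and encoder names are hypothetical):

IVideoSource source = new RawVideoSource(@"c:\video.raw");   // or any IVideoSource
var conversion = new VideoSourceToVideo(source, @"c:\out.avi", "SomeInstalledEncoder");
conversion.Start();   // the constructor already runs the graph; Start spins up the event loop
// ... wait for the Stopping event or poll PercentageCompleted ...
conversion.Dispose();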

IVideoSource:

public interface IVideoSource : IGenericSampleCB
{
    double PercentageCompleted { get; }
    int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
    void SetMediaType(global::IPerform.Video.Conversion.Interops.IGenericSampleConfig psc);
    int SetTimeStamps(global::DirectShowLib.IMediaSample pSample, int iFrameNumber);
}

ImageVideoSource (mostly taken from DirectShow.NET examples):

// A generic class to support easily changing between my different sources of data.
//
// Note: You DON'T have to use this class, or anything like it.  The key is the SampleCallback
// routine.  How/where you get your bitmaps is ENTIRELY up to you.  Having SampleCallback call
// members of this class was just the approach I used to isolate the data handling.
public abstract class ImageVideoSource : IDisposable, IVideoSource
{
    #region Definitions

    /// <summary>
    /// 100 ns - used by a number of DS methods
    /// </summary>
    private const long UNIT = 10000000;

    #endregion

    /// <summary>
    /// Number of callbacks that returned a positive result
    /// </summary>
    private int m_iFrameNumber = 0;

    virtual public void Dispose()
    {
    }

    public abstract double PercentageCompleted { get; protected set; }

    abstract public void SetMediaType(IGenericSampleConfig psc);
    abstract public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead);
    virtual public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
    {
        return 0;
    }

    /// <summary>
    /// Called by the GenericSampleSourceFilter.  This routine populates the MediaSample.
    /// </summary>
    /// <param name="pSample">Pointer to a sample</param>
    /// <returns>0 = success, 1 = end of stream, negative values for errors</returns>
    virtual public int SampleCallback(IMediaSample pSample)
    {
        int hr;
        IntPtr pData;

        try
        {
            // Get the buffer into which we will copy the data
            hr = pSample.GetPointer(out pData);
            if (hr >= 0)
            {
                // Set TRUE on every sample for uncompressed frames
                hr = pSample.SetSyncPoint(true);
                if (hr >= 0)
                {
                    // Find out the amount of space in the buffer
                    int cbData = pSample.GetSize();

                    hr = SetTimeStamps(pSample, m_iFrameNumber);
                    if (hr >= 0)
                    {
                        int iRead;

                        // Copy the data into the sample
                        hr = GetImage(m_iFrameNumber, pData, cbData, out iRead);
                        if (hr == 0) // 1 == End of stream
                        {
                            pSample.SetActualDataLength(iRead);

                            // increment the frame number for next time
                            m_iFrameNumber++;
                        }
                    }
                }
            }
        }
        finally
        {
            // Release our pointer to the media sample.  THIS IS ESSENTIAL!  If
            // you don't do this, the graph will stop after about 2 samples.
            Marshal.ReleaseComObject(pSample);
        }

        return hr;
    }
}

RawVideoSource (an example of a concrete managed source generator for a DirectShow pipeline):

internal class RawVideoSource : ImageVideoSource
{
    private byte[] buffer;
    private byte[] demosaicBuffer;
    private RawVideoReader reader;

    public override double PercentageCompleted
    {
        get;
        protected set;
    }

    public RawVideoSource(string sourceFile)
    {
        reader = new RawVideoReader(sourceFile);
    }

    override public void SetMediaType(IGenericSampleConfig psc)
    {
        BitmapInfoHeader bmi = new BitmapInfoHeader();

        bmi.Size = Marshal.SizeOf(typeof(BitmapInfoHeader));
        bmi.Width = reader.Header.VideoSize.Width;
        bmi.Height = reader.Header.VideoSize.Height;
        bmi.Planes = 1;
        bmi.BitCount = 24;
        bmi.Compression = 0;
        bmi.ImageSize = (bmi.BitCount / 8) * bmi.Width * bmi.Height;
        bmi.XPelsPerMeter = 0;
        bmi.YPelsPerMeter = 0;
        bmi.ClrUsed = 0;
        bmi.ClrImportant = 0;

        int hr = psc.SetMediaTypeFromBitmap(bmi, 0);

        buffer = new byte[reader.Header.FrameSize];
        demosaicBuffer = new byte[reader.Header.FrameSize * 3];

        DsError.ThrowExceptionForHR(hr);
    }

    long startFrameTime;
    long endFrameTime;
    unsafe override public int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
    {
        int hr = 0;

        if (iFrameNumber < reader.Header.NumberOfFrames)
        {
            reader.ReadFrame(buffer, iFrameNumber, out startFrameTime, out endFrameTime);

            Demosaic.DemosaicGBGR24Bilinear(buffer, demosaicBuffer, reader.Header.VideoSize);

            Marshal.Copy(demosaicBuffer, 0, ip, reader.Header.FrameSize * 3);

            PercentageCompleted = ((double)iFrameNumber / reader.Header.NumberOfFrames) * 100.0;
        }
        else
        {
            PercentageCompleted = 100;

            hr = 1; // End of stream
        }

        iRead = iSize;

        return hr;
    }

    override public int SetTimeStamps(IMediaSample pSample, int iFrameNumber)
    {
        reader.ReadTimeStamps(iFrameNumber, out startFrameTime, out endFrameTime);

        DsLong rtStart = new DsLong(startFrameTime);
        DsLong rtStop = new DsLong(endFrameTime);

        int hr = pSample.SetTime(rtStart, rtStop);

        return hr;
    }
}
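
For the live IP-camera case in the question, the same pattern works without a file reader: keep the newest frame in memory and serve it from GetImage. A minimal sketch (hypothetical class; it assumes the camera thread has already converted each frame to 24-bit RGB and calls UpdateFrame with it):

internal class CameraVideoSource : ImageVideoSource
{
    private readonly object frameLock = new object();
    private readonly byte[] latestFrame = new byte[1280 * 720 * 3]; // 24-bit RGB

    public override double PercentageCompleted { get; protected set; }

    // Called from the camera thread with each converted RGB frame.
    public void UpdateFrame(byte[] rgb)
    {
        lock (frameLock)
        {
            Buffer.BlockCopy(rgb, 0, latestFrame, 0, latestFrame.Length);
        }
    }

    public override void SetMediaType(IGenericSampleConfig psc)
    {
        BitmapInfoHeader bmi = new BitmapInfoHeader();
        bmi.Size = Marshal.SizeOf(typeof(BitmapInfoHeader));
        bmi.Width = 1280;
        bmi.Height = 720;
        bmi.Planes = 1;
        bmi.BitCount = 24;
        bmi.ImageSize = 1280 * 720 * 3;

        DsError.ThrowExceptionForHR(psc.SetMediaTypeFromBitmap(bmi, 0));
    }

    public override int GetImage(int iFrameNumber, IntPtr ip, int iSize, out int iRead)
    {
        // Serve the newest frame; a live source never signals end of stream.
        lock (frameLock)
        {
            Marshal.Copy(latestFrame, 0, ip, Math.Min(iSize, latestFrame.Length));
        }
        iRead = iSize;
        return 0;
    }
}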

And the interops to the GSSF.AX COM:

namespace IPerform.Video.Conversion.Interops
{
    [ComImport, Guid("6F7BCF72-D0C2-4449-BE0E-B12F580D056D")]
    public class GenericSampleSourceFilter
    {
    }

    [InterfaceType(ComInterfaceType.InterfaceIsIUnknown),
    Guid("33B9EE57-1067-45fa-B12D-C37517F09FC0")]
    public interface IGenericSampleCB
    {
        [PreserveSig]
        int SampleCallback(IMediaSample pSample);
    }

    [Guid("CE50FFF9-1BA8-4788-8131-BDE7D4FFC27F"),
    InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
    public interface IGenericSampleConfig
    {
        [PreserveSig]
        int SetMediaTypeFromBitmap(BitmapInfoHeader bmi, long lFPS);

        [PreserveSig]
        int SetMediaType([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt);

        [PreserveSig]
        int SetMediaTypeEx([MarshalAs(UnmanagedType.LPStruct)] AMMediaType amt, int lBufferSize);

        [PreserveSig]
        int SetBitmapCB(IGenericSampleCB pfn);
    }
}

Good luck; try to get it working using this, or comment with further questions so we can iron out any remaining issues.

Dennis Smit
  • What is RawVideoReader? I don't know where that's defined or what it is – Grant Sep 01 '11 at 23:17
  • I have finally gotten GSSF to work. I have a buffer of RGB and it is able to display it, so I'm stoked and working from there – Grant Sep 01 '11 at 23:51
  • RawVideoReader is an IImageVideoSource implementation I use for a project of mine. What steps are you missing right now? Maybe I can provide some advice. – Dennis Smit Sep 03 '11 at 17:29
  • If you have more questions, please ask, so we can make this question/answer as complete as possible for future reference when other people run into the same issues! – Dennis Smit Sep 06 '11 at 22:19