GPIO Drivers

Question
-
Is the GPIO driver functionality available on desktop Windows 10?
The article https://msdn.microsoft.com/en-us/library/windows/hardware/hh439508(v=vs.85).aspx suggests that it is, but I have not been able to make it work. Even after installing a seemingly working GPIO driver, I get a file-not-found exception during GpioController.GetDefault().
Tuesday, September 22, 2015 12:38 PM
Answers
-
Yes, I have included a reference to the IoT extension.
I have some new information though: my working board is a MinnowBoard Max (32-bit Windows IoT), and my test board has 64-bit Windows 10 desktop. If I rebuild my test app as 64-bit I get null returned from GetDefault, which is what I would expect if the actual GPIO driver wasn't working, which is probably the case as it's just an experiment at the moment.
I will investigate further, but for now I think the file-not-found exception is answered. Thanks, everyone.
- Marked as answer by Mark Cullen1 Tuesday, September 22, 2015 2:28 PM
Tuesday, September 22, 2015 2:28 PM -
The APIs themselves are present in all editions of Windows (Desktop, Mobile, IoT Core), but the driver (rhproxy.sys) that proxies GPIO resources from the Windows GPIO Driver Model (GpioClx) to usermode is present only on IoT Core. You will be able to call GpioController.GetDefault() on any SKU, but it will return NULL on every SKU except IoT Core.
- Marked as answer by Marcus Russell [MSFT] Thursday, November 19, 2015 4:46 PM
Friday, September 25, 2015 9:46 PM
All replies
-
What device is it running on? Does it have any GPIO? If you somehow managed to install GPIO drivers on your desktop PC (which it sounds like you did) then there wouldn't be any GPIO hardware for the drivers to interact with.
Tuesday, September 22, 2015 12:46 PM
-
As there is no GPIO hardware on a desktop Windows 10 PC, no.
- Edited by riclh Tuesday, September 22, 2015 12:58 PM
- Proposed as answer by riclh Tuesday, September 22, 2015 12:58 PM
- Unproposed as answer by Mark Cullen1 Tuesday, September 22, 2015 1:14 PM
Tuesday, September 22, 2015 12:57 PM -
I work for an embedded PC manufacturer and I'm working on one of our own boards, which does have GPIO hardware.
Tuesday, September 22, 2015 1:09 PM -
Have you included a reference to the Windows IoT Extension SDK in your manifest?
Tuesday, September 22, 2015 2:09 PM -
Is the board using the same processor (or processor family, Intel E3800) as the MinnowBoard Max? You would have to modify the GPIO driver to adapt it to your hardware.
Sean Liming - Book Author: Starter Guide SIM (WEI), Pro Guide to WE8S & WES 7, Pro Guide to POS for .NET - www.annabooks.com / www.seanliming.com
Tuesday, September 22, 2015 2:20 PM -
Hi Jordan
Thanks for that update.
I would like to suggest that Microsoft consider either including the system-level GPIO components in desktop SKUs or allowing them to be installed in some way. As an embedded PC manufacturer, we are often asked to provide larger-scale systems with desktop versions of Windows that still require GPIO functionality. At the moment we provide this with bespoke APIs, but it would be much better to use the Microsoft ones.
Monday, September 28, 2015 1:43 PM -
Do you know if there is any way to install the drivers IoT Core uses on Windows 10 Desktop?
I could install the rhproxy driver I found in the system32\DriverStore\FileRepository directory, but after the installation it says that this device is waiting for other devices to be ready. It gives me a dependency list of other devices, but all the other devices have drivers installed (I believe those are not the correct drivers). The dependencies are the GPIO controller, SPI controller and I2C controller.
Can someone help me find the correct drivers for these as well? By correct drivers I mean the drivers IoT Core uses for the MinnowBoard Max.
Thanks!
Friday, October 30, 2015 3:59 PM -
Hello Tomikapc,
Jordan's summary above is correct. The components you mentioned can be installed on other editions and SKUs, but GpioController.GetDefault() will return null on platforms other than IoT Core.
Thursday, November 19, 2015 5:01 PM -
Good to know that GpioClx is still there on desktop, because I am currently implementing my own direct driver with user-mode access. The Microsoft driver doesn't provide any way to get precise timings of inputs (it fires changed events without a timestamp), which makes protocol decoding impossible at high frequencies.
Well, that's the plan anyway. My particular goal is protocol decoding over GPIO, not making it available on the desktop. But so long as the API is flexible (my preference) it should work.
But what hardware device do you have in the first place, which provides a GPIO controller to the PC that GpioClx already recognizes? I'd like to test it.
Key Artefacts
Thursday, November 19, 2015 7:36 PM -
That is an excellent point about the absence of timestamps on the ValueChanged events. I hope someone at Microsoft is paying attention to that, because it is a serious hole in the API (compare to the .NET Micro Framework class InterruptPort, which provides time-stamped events). Timing is a seriously important requirement for embedded systems, and so far Windows IoT Core fails to provide anything to address this: none of the CPU counter/timer block is exposed and events are not time-stamped, which makes it pretty useless for anything even remotely 'real time'.
Using a Netduino and the .NET Micro Framework, I have successfully measured a 1 kHz signal (a square wave on an input pin configured as an InterruptPort). So currently, with its single CPU and interpreted MSIL, the Netduino outperforms the quad-core AOT-compiled Raspberry Pi 2 by 30 times. WAT? On the Raspberry Pi, the best resolution I can get is about 30 milliseconds, 30 times slower than a Netduino. Crazy!
Tim Long
Friday, November 20, 2015 11:34 PM -
Well, https://github.com/ms-iot/samples/blob/develop/GpioOneWire/MainPage.xaml.cpp is sampling a OneWire GPIO connection at the microsecond level:
// A '0' has a pulse time of 76 microseconds, while a '1' has a
// pulse time of 120 microseconds. 110 is chosen as a reasonable threshold.
but that might be wrong?
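As a rough illustration of the decoding logic that sample relies on (sketched in Python here purely for illustration, using the pulse times from the comment above), classifying bits is just a threshold comparison on measured pulse widths:

```python
# Classify OneWire-style pulses by width: a '0' is ~76 us, a '1' is ~120 us,
# so anything above the 110 us threshold is read as a '1'.
THRESHOLD_US = 110

def decode_pulses(pulse_widths_us):
    """Turn a list of measured high-pulse widths (microseconds) into bits."""
    return [1 if w > THRESHOLD_US else 0 for w in pulse_widths_us]

bits = decode_pulses([76, 120, 118, 80, 75, 122])
print(bits)  # [0, 1, 1, 0, 0, 1]
```

The hard part, of course, is measuring those pulse widths accurately in the first place, which is what the rest of this thread is about.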
riclh
- Edited by riclh Saturday, November 21, 2015 12:31 AM
Saturday, November 21, 2015 12:15 AM -
That's not a solution; it's a one-time spin-lock to read a short sample.
We're talking about real background work, an ongoing thread which serves some useful purpose, not a dumb sample app. Not a real-world solution!
Microsoft already did the hard work to provide an event-driven model, with their driver doing the low-level wire tapping. They just forgot to add one property (a timestamp) to the event class and populate it with the content of the high-resolution timer. A very simple change indeed, which would bring with it many capabilities for us all.
Currently, if you want to measure pulse lengths, you have to subscribe for the event and then quickly try to read the stopwatch to guess when it occurred. That's always inaccurate, of course, because you're measuring the time after the fact. Totally the wrong way around, and duplicated effort, but there is no choice because MS did not add the timestamp.
The other issue is that they did a pretty good job of hiding any capability to create time-critical threads in all languages but C++. So even if developers could get the timestamp in C# and other languages, they would then find the data gets garbled whenever the GUI or other background tasks are busy. Not fit for 'real-time/embedded' work at all!
So if you want to do this sort of thing, first you must write C++ code to get access to high-priority threads, and then you must still guess the timestamp and accept the error rate on your samples (timestamped after the fact). Not good; forget high-resolution stuff. Or you do like that sample and spin the processor to 100% to get the high-speed PWM, but then have no CPU time left to do anything useful with.
That's why I'm writing a driver for my needs. But it's over the top for most people, and I usually prefer to build on standard APIs. MS could really help a lot if they addressed these two blocking issues. In order to be successful in this space they must demonstrate some real-world, real-time capabilities.
- Edited by Code Chief Saturday, November 21, 2015 4:17 PM
Saturday, November 21, 2015 4:14 PM -
Have you looked at https://github.com/ms-iot/samples/blob/master/GpioInterrupt/CS/GpioInterrupt/StartupTask.cs?
Modifying it a bit to get some timing information and, if required, create a hardware event queue:
namespace GPIOEventqueue
{
    public class HardwareEvent
    {
        public long ticks;
        public GpioPinEdge edge;

        public HardwareEvent(long ticks, GpioPinEdge edge)
        {
            this.ticks = ticks;
            this.edge = edge;
        }
    }

    /// <summary>
    /// An empty page that can be used on its own or navigated to within a Frame.
    /// </summary>
    public sealed partial class MainPage : Page
    {
        const int GPIO_INPUT_PIN = 5;
        const int SAMPLE_SIZE = 1000;

        private GpioPin pin = null;
        private Stopwatch stopwatch = null;
        private bool capturing = false;
        private List<HardwareEvent> Edges;
        //private Queue<HardwareEvent> Edges;

        public MainPage()
        {
            InitializeComponent();
            NavigationCacheMode = NavigationCacheMode.Enabled;
            stopwatch = Stopwatch.StartNew();
            //Debug.WriteLine(" Timer frequency in ticks per second = {0}", Stopwatch.Frequency);
            //Debug.WriteLine(" Timer is accurate within {0} nanoseconds", (1000L * 1000L * 1000L) / Stopwatch.Frequency);
            // Timer frequency in ticks per second = 19200000
            // Timer is accurate within 52 nanoseconds
            SetupGpio();
            Edges = new List<HardwareEvent>();
            //Edges = new Queue<HardwareEvent>();
            Loaded += (sender, e) => { if (pin != null) pin.ValueChanged += Pin_ValueChanged; };
            Unloaded += (sender, e) => { if (pin != null) pin.ValueChanged -= Pin_ValueChanged; };
        }

        private void SetupGpio()
        {
            pin = null;
            try
            {
                GpioController gpio = GpioController.GetDefault();
                if (gpio != null)
                {
                    pin = gpio.OpenPin(GPIO_INPUT_PIN);
                    if (pin != null)
                    {
                        pin.SetDriveMode(GpioPinDriveMode.Input);
                    }
                }
            }
            catch (Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        private void Pin_ValueChanged(GpioPin sender, GpioPinValueChangedEventArgs args)
        {
            HardwareEvent he = new HardwareEvent(stopwatch.ElapsedTicks, args.Edge);
            if (capturing)
            {
                Edges.Add(he);
                //Edges.Enqueue(he);
                if (Edges.Count >= SAMPLE_SIZE)
                {
                    capturing = false;
                    // just trigger the display to the screen, don't (a)wait for it to complete
                    Dispatcher.RunAsync(CoreDispatcherPriority.Normal, () => { DisplayEdges(); });
                }
            }
        }

        private void CaptureButton_Click(object sender, RoutedEventArgs e)
        {
            Edges.Clear();
            capturing = true;
        }

        private void DisplayEdges()
        {
            // TODO
        }
    }
}
I would be interested in what frequency that tops out at.
riclh
- Edited by riclh Saturday, November 21, 2015 8:35 PM
Saturday, November 21, 2015 6:26 PM -
Yes been there, done that.
The issue remains the same. The code here is guessing the timestamp after the fact, when it creates the HardwareEvent. The Microsoft driver does most of the work providing the event, and some time later, after our normal/lower-priority GUI thread picks up the event, the timestamp is added.
I did a lot of testing with similar code I wrote myself before, trying to decode a CPPM signal. That's a protocol with up to 20ms batches, with some 1-2ms cycles inside representing the data values. The result with the original RTM (inbox) driver was terrible. Less than 80% of the messages were understood (because they overshot the maximum time allowed for high/low), and the actual PWM values (positions of the RC sticks) were wildly inaccurate, far beyond the natural analog background movement (jumping way up high and low, not just fluctuating a bit).
Now that was CPPM; if you take SBus you have to do the same thing 3 times faster with 10 times shorter cycles = no chance.
The most sensible solution is for Microsoft to simply add a timestamp to their driver event. But even then we still need high-priority/real-time thread scheduling capability, or a capture ring buffer implemented directly in the driver.
I tried to test it with the new "100 times faster" driver in the Windows IoT Preview, but then there were many issues with the new Visual Studio tools and RC SDKs, so I gave up with that until the next proper RTM SDK. I'm not wasting time with previews anymore due to the massive misalignment of VS, SDK and platform releases (it just doesn't work), so I'm focusing on my own driver with the RTM build right now.
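For what it's worth, the capture ring buffer idea can be sketched in a few lines (illustrative Python, not any driver API; the name EdgeRingBuffer is made up, and a real implementation would live inside the kernel driver and be drained from user mode):

```python
from collections import deque

class EdgeRingBuffer:
    """Fixed-capacity buffer of (timestamp_ticks, edge) samples.
    When full, the oldest sample is overwritten so capture never blocks."""
    def __init__(self, capacity):
        self.samples = deque(maxlen=capacity)

    def record(self, ticks, edge):
        self.samples.append((ticks, edge))  # drops the oldest when at capacity

    def drain(self):
        """Hand the captured samples to user-mode code and reset the buffer."""
        out = list(self.samples)
        self.samples.clear()
        return out

buf = EdgeRingBuffer(capacity=4)
for i, edge in enumerate(["rising", "falling", "rising", "falling", "rising"]):
    buf.record(i * 100, edge)
print(buf.drain())  # the oldest sample (0, 'rising') was overwritten
```

The point of the ring buffer is that timestamping happens at capture time in the driver, and the slow user-mode consumer only affects how often it drains, not the accuracy of the samples.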
Key Artefacts
Wednesday, November 25, 2015 8:37 PM -
We'll look into adding a timestamp to the interrupt event. Due to overhead in the ISR path, the timestamp will read about 30us after the pin actually changed, and will be subject to jitter due to other ISRs. All interrupts are delivered to core 0, and with the USB controller firing 8000 interrupts/s, there will be considerable jitter.
- Proposed as answer by riclh Wednesday, December 16, 2015 9:44 PM
Monday, December 7, 2015 8:57 AM -
I think that not losing edges is more important than exact absolute timings in any case. The fixed ISR overhead of ~30us just shifts the recorded time the interrupt occurred slightly, and could be accounted for if required. That is almost never necessary as far as I am concerned, as relative time between edges is usually more important than absolute time here.
Jitter (which, as you note, can be considerable) is always going to be a problem, but missing edges will be a larger one, I think.
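The point that a fixed ISR overhead can be accounted for is easy to demonstrate: a constant offset added to every timestamp cancels out when you take differences between edges (illustrative Python, using the ~30us figure quoted above; jitter, by contrast, would not cancel):

```python
ISR_OVERHEAD_US = 30  # assumed fixed delay between pin change and timestamp

true_edges_us = [0, 500, 1500, 2000]
recorded_us = [t + ISR_OVERHEAD_US for t in true_edges_us]

# Relative time between consecutive edges, computed both ways.
true_deltas = [b - a for a, b in zip(true_edges_us, true_edges_us[1:])]
recorded_deltas = [b - a for a, b in zip(recorded_us, recorded_us[1:])]

print(recorded_deltas == true_deltas)  # True: a constant offset cancels
```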
riclh
- Edited by riclh Monday, December 7, 2015 12:37 PM
Monday, December 7, 2015 12:30 PM -
It has nothing to do with losing edges. The addition of the high-resolution timer, which is already present and frequently used in drivers, will not cause any additional edges to be missed, not beyond the current accuracy anyway.
I *know* the opposite is true: more edges are lost when we have to guess the timestamp after the event. Not only because we are too slow, but also because the data we receive is inaccurate and exceeds the tolerances of the protocol (e.g. CPPM must be <2ms per cycle with max 0.5ms low). The interrupt service routine (in the driver and framework broker) has nothing to do with the overall delay; that is the fast part, where we simply want to add the timestamp to the event data! Currently we have to do it in the slow (user-mode) part!
The next point is about the average use case. Who is sampling millions of edges and is not interested in the time between pulses? What sort of data is that then, just high and low with no meaning in between (useless)? PWM is all about duty cycles/lengths; that is how the data is communicated.
Forget the unrealistic "blinky" samples. Real-world use of GPIO requires accurate timing, and it is absolutely simple to implement in the correct place, the C++ driver from Microsoft. To do it properly they should also implement a configurable ring buffer (but I'd be happy with the timestamp to start with). My example with RC input shows the current IoT system is unable to do something that is extremely simple on embedded/real-time Linux (which MS should be competing with if they want to succeed).
What I am suggesting is really the minimum effort for maximum gain. Please stop side-tracking this issue; we need @MSFT attention here! It is clear multiple users (and common industry use cases) have requirements for GPIO timing. We don't need to discuss whether it is really a problem or not; we know it!
Key Artefacts
Wednesday, December 9, 2015 3:35 PM -
I think you miss the point. If the interrupt latency (because of other tasks/interrupts) for the GPIO interrupt falls beyond the period between three edges, then an edge will/may be lost. At a latency timespan of two edges the effect is implementation dependent.
Jitter will also occur if the sampling time period is less than the distance between two edges and other tasks/interrupts get in the way of it being reported immediately.
The maximum sampling frequency is determined by the number of lines of code in the 'user' interrupt routine. The fewer the better.
The difference between capturing time at 'user' level and 'system' level is almost always going to be a fixed value (unless other tasks/interrupts occur during that very short time span/lines of code).
A Stopwatch has a resolution of 52ns between ticks, i.e. the very highest counter/timer available.
The 'lost edges'/jitter occurs underneath/alongside the GPIO driver layer. It is in the hardware itself and how it is realised in the OS, along with everything else.
What I am trying to point out is that even if you get what you want, it will not solve the problem you have: running on a non-real-time OS, which is what IoT Core (and normal Linux) is, while wanting 0ns jitter.
Note I said that relative time between edges is what this is all about. Absolute time is normally just used to calculate that.
YMMV.
riclh
- Edited by riclh Wednesday, December 9, 2015 4:37 PM
Wednesday, December 9, 2015 3:52 PM -
Well, I did ask, but you continue with the same side discussion. One thing you say is wrong: that user-mode code and driver code will take the same time. That is the only requirement here, that the timestamp is captured efficiently in driver-level code during its hardware interrupt/timer, which is NOT interrupted (just a few ns in total). There is NO way any user-mode code could ever get the timestamp with anywhere near the same accuracy.
Also regarding "real time": in the real world it is nothing more than a cut-down OS which allows uninterrupted processes to run (and potentially hang) the system. Windows already has that, and I think that is why MS doesn't release a "real-time" build of their OS, as they have always had the "realtime" process scheduling level.
You can break any OS if you are silly enough to run too much code without yielding. That's why drivers are focused on specific tasks, with the rest in user mode, because normally the only reason to do "dangerous" (potentially hanging) operations is with hardware.
So here we have hardware which naturally, in many (perhaps most useful) use cases, needs timing. The ONLY correct place to be timestamping is in the driver.
I wonder why you are so concerned about this suggestion that you spend so much time responding. Do you think the addition of the timestamp call in the driver will slow the whole thing down? Because the current accuracy is actually pretty poor compared to real-time Linux, which is no different (running on the same hardware) from the Windows driver at the kernel level.
But fair enough, that is a valid point for use-case acceptance. On the one side many people will gain more accurate (actually more like feasible vs useless) PWM decoding, and the people who want to blink LEDs won't notice any difference ;-)
Key Artefacts
Monday, December 14, 2015 10:16 PM -
I rather suspect that the latency/jitter before the GPIO driver even gets a chance to run, and thus satisfy the interrupt, is a lot bigger than you expect. There are over 8000 interrupts per second for the USB alone (from above). The latency/jitter you will see is bounded by the length of code in those drivers, as well as the GPIO ones and your user code. They all run on core 0, so a shared time space.
That is why I think that even if you get what you want, it won't do what you wish (and what was hinted at above).
I am working on a method that will reliably get 20-40us resolution of any input signal on a pin, but it does not use GPIO.
riclh
- Edited by riclh Monday, December 14, 2015 10:58 PM
Monday, December 14, 2015 10:26 PM -
We'll look into adding a timestamp to the interrupt event. Due to overhead in the ISR path the timestamp will read about ~30us after the pin actually changed, and will be subject to jitter due to other ISRs. All interrupts are delivered to core 0 and with the USB controller firing 8000 interrupts/s, there will be considerable jitter.
I almost missed this one... yes, thanks for considering it!
I understand the timing will never be perfect, and if some USB device is busy there could be some packets which have to be discarded = acceptable. For CPPM we are measuring <=500us low, with 500-1500us high being the actual data of the stick/slider/switch position. Within this range the relatively small amount of jitter is acceptable (compared to what was observed in a multi-threaded user-mode app trying to guess it after the ISR).
The general consensus appears to be that a software GPIO is never going to be fantastic. I didn't really expect (just hoped) to achieve things like SBus/serial decoding, but even some "kind of reliable" CPPM should work, and if Windows IoT can do it too that would be a great achievement! For RC input at least, we can apply some statistical functions to smooth the control movements. It was nowhere near possible before.
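To make the CPPM numbers concrete, frame splitting on measured pulse widths could be sketched like this (illustrative Python; the 2000us sync threshold is an assumption, chosen to sit above the 1500us maximum channel pulse quoted above):

```python
# Split a stream of high-pulse widths (in microseconds) into CPPM frames.
# Assumptions (hypothetical, based on the timings quoted above): channel
# pulses are 500-1500 us, and any longer high period is the frame sync gap.
SYNC_THRESHOLD_US = 2000

def split_frames(high_widths_us):
    frames, current = [], []
    for width in high_widths_us:
        if width >= SYNC_THRESHOLD_US:   # sync gap: frame boundary
            if current:
                frames.append(current)
            current = []
        elif 500 <= width <= 1500:       # valid channel pulse
            current.append(width)
        # out-of-range pulses are discarded as noise/jitter casualties
    if current:
        frames.append(current)
    return frames

print(split_frames([1000, 1500, 700, 9000, 900, 1100]))
# [[1000, 1500, 700], [900, 1100]]
```

This is exactly the kind of decoding that becomes feasible once the edge timestamps come from the driver rather than being guessed after the fact.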
- Edited by Code Chief Wednesday, December 16, 2015 6:18 AM
- Proposed as answer by riclh Wednesday, December 16, 2015 9:45 PM
Wednesday, December 16, 2015 6:18 AM