New ADC V2 with Async support #814
base: master
Conversation
Automatically convert pins to correct mode in Channel::with_pins
`wait_flags` and buffered read methods
I just finished a first pass at reviewing this. In general, I would say this is an excellent implementation. Given your willingness to learn best practices, my review is more detailed than usual, so feel free to ignore some of these comments if you feel they're too nitpicky.
That being said, I do have some other comments that I feel deserve some attention:
- SAMD11 support. I feel we're so close that it makes sense to finish the implementation for SAMD11 chips as well. As far as I can tell, the ADC is exactly the same between D21 and D11 targets. The only thing that might change, and that I'm unsure of, is the maximum supported clock speed.
- We've already discussed this a little, but I'm not so sure the `Channel` API is so necessary after all. When I initially suggested this, I was under the impression that the ADC could perform conversions simultaneously on multiple channels, similar to EIC or DMAC. That's obviously not the case. As far as I can tell, there is also a 1:1 relationship between ADC channels and physical pins. So while the ADC can mux between channels, there is no pin-channel muxing going on. I think the ADC could simply take `&mut` refs to already-configured pins instead. This would cut down on complexity a whole bunch.
- Before merging, the Tier 1 BSP examples will have to be updated, and clippy warnings resolved. Ideally we would show examples where the CPU clock is run at full speed, and a second GCLK is started at a slower speed to clock the ADC.
- I like that the d51 implementation takes advantage of the v2 clock system. However, it seems like the new adc module is not taking full advantage of the typesafe clocking system: one needs to (unsafely) steal the peripherals to create an `MCLK` instance in order to configure the ADC. We should instead use the clocks system in its intended way, which is to exchange an `ApbToken<Adcx>` using `Apb::enable` to get a configured clock type, which can then be passed to the `Adc` peripheral, thus proving the clock is enabled. This is somewhat the inverse of what the module is currently doing, which is enabling the ADC clock inside the constructor using the MCLK peripheral.
```rust
self.sync();
self.adc.inputctrl().modify(|_, w| {
    w.muxneg().gnd();
    w.gain().variant(Gainselect::Div2)
```
I know this was in the previous ADC implementation, but do we know why a gain of 1/2 is necessary here?
Not sure, but when testing, it appears that a gain of 1x actually provides a 2x amplification of the reported output voltage, and 1/2 outputs 1x. I'll need to read the datasheet more as to why this might be the case; maybe something else needs configuring?
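To illustrate the behaviour described above, here's a small sketch (my own illustration, not the HAL's API) of converting a raw 12-bit reading to volts while compensating for the apparent doubling. The `correction` factor is an assumption: 1.0 for the GAIN = 1/2 setting (which reports unamplified values) and 2.0 for GAIN = 1x (which appears doubled):

```rust
// Sketch (assumption, not the HAL API): convert a raw 12-bit ADC count
// to volts, dividing out the apparent front-end amplification.
fn counts_to_volts(counts: u16, vref: f32, correction: f32) -> f32 {
    // 12-bit full scale is 4095 counts.
    (counts as f32 / 4095.0) * vref / correction
}

fn main() {
    // Mid-scale reading with Vref = 3.3 V:
    // GAIN = 1/2 appears unamplified, so no correction is needed.
    let v_div2 = counts_to_volts(2048, 3.3, 1.0);
    // GAIN = 1x appears doubled, so divide the result by 2.
    let v_1x = counts_to_volts(2048, 3.3, 2.0);
    println!("div2: {v_div2:.4} V, 1x: {v_1x:.4} V");
}
```

If the doubling really is inherent to the gain stage, a correction like this would belong in the driver rather than in user code.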
Here's a proof of concept of what I mean by using the typesafe clocking system. It's still untested and needs some more feature gating to accommodate D11 and D21 targets, but I think it gets the idea across: https://github.com/jbeaurivage/atsamd/tree/adc-v2-clocking

With an example usage:

```rust
let pins = Pins::new(cx.device.port);

let (buses, clocks, tokens) = clock_system_at_reset(
    cx.device.oscctrl,
    cx.device.osc32kctrl,
    cx.device.gclk,
    cx.device.mclk,
    &mut cx.device.nvmctrl,
);

let mut apb = buses.apb;
let adc0_clk = apb.enable(tokens.apbs.adc0);

let adc0_settings = Config::new()
    .clock_cycles_per_sample(5)
    .clock_divider(Prescalerselect::Div2)
    .sample_resolution(AdcResolution::_12bit)
    .accumulation_method(AdcAccumulation::Single);

let gclk0 = clocks.gclk0;
let (pclk_adc0, gclk0) = Pclk::enable(tokens.pclks.adc0, gclk0);
let (_pclk_adc1, _gclk0) = Pclk::enable(tokens.pclks.adc1, gclk0);

let (adc0, channels_adc0) =
    Adc::new(cx.device.adc0, adc0_settings, adc0_clk, pclk_adc0.into()).unwrap();
```
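As a rough sanity check on a configuration like the one above, here's a back-of-envelope throughput estimate. The formula is my own simplification, not the datasheet's: it assumes sampling takes `clock_cycles_per_sample` ADC clock cycles and conversion takes roughly one cycle per bit of resolution, and the 12 MHz GCLK frequency is an assumed example:

```rust
// Rough sample-rate estimate (simplified assumption: sampling costs
// `cycles_per_sample` ADC clock cycles, conversion roughly one cycle
// per bit of resolution; see the datasheet for the exact timing).
fn approx_sample_rate_hz(
    gclk_hz: u32,
    prescaler: u32,
    cycles_per_sample: u32,
    resolution_bits: u32,
) -> u32 {
    let adc_clk = gclk_hz / prescaler;
    adc_clk / (cycles_per_sample + resolution_bits)
}

fn main() {
    // Assumed 12 MHz GCLK feeding the ADC, DIV2 prescaler,
    // 5 sampling cycles, 12-bit resolution:
    let rate = approx_sample_rate_hz(12_000_000, 2, 5, 12);
    println!("~{rate} samples/s");
}
```

This kind of estimate is mainly useful for checking that the divided ADC clock stays within the datasheet's maximum, which matters for the "slower GCLK for the ADC" examples mentioned earlier.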
@rnd-ash, your additional commits look good. You can check out this PR, which implements some of the larger changes mentioned here.

I also want to point out that when testing on a SAMD51 board (Metro M4), I regularly get the ADC stalling while waiting for the RESRDY flag to get set. So far the only way I managed to get ADC readings was:

To me this sounds like there is still a clock synchronization issue or a race condition somewhere. So, among the things left to do:
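One generic way to keep a missed flag from hanging the driver while this is debugged is to bound the busy-wait, so a synchronization bug surfaces as an error instead of an infinite spin. This is a sketch of the pattern using a closure standing in for the RESRDY check, not the HAL's actual API:

```rust
// Bounded-poll helper: spin on `ready()` at most `max_polls` times,
// returning Err on timeout instead of hanging forever. Sketch only;
// the real driver would read the RESRDY bit here.
fn wait_ready(mut ready: impl FnMut() -> bool, max_polls: u32) -> Result<(), &'static str> {
    for _ in 0..max_polls {
        if ready() {
            return Ok(());
        }
        core::hint::spin_loop();
    }
    Err("timed out waiting for RESRDY")
}

fn main() {
    // Simulate a flag that becomes ready after a few polls:
    let mut n = 0;
    assert!(wait_ready(|| { n += 1; n >= 3 }, 10).is_ok());
    // And one that never does, which now fails loudly:
    assert!(wait_ready(|| false, 10).is_err());
}
```

Even if the root cause turns out to be a missing `sync()` before the flag read, a timeout like this makes the stall much easier to diagnose in the field.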
ADC v2: Remove Channel machinery, fully use `v2` clocks
ADC improvements (continued)
Update T1 BSPs and examples
Summary
This PR adds a new ADC API, based around a channel system similar to the one used by the EIC and DMA peripherals.
For the moment I'll keep this as a draft PR for feedback while I'm still finalizing bits and adding SAMD11/21 support.
Features
Progress checklist