This is a vastly simplified example of the issue I'm running into. Given a trait Thing whose trait objects implement Ord, and a struct Object which implements Thing (both sketched further down), I have the following struct:
use std::collections::HashMap;

pub struct MyStruct<'a> {
    my_things: HashMap<i32, Vec<Box<dyn Thing + 'a>>>,
}

impl<'a> MyStruct<'a> {
    pub fn new() -> MyStruct<'a> {
        MyStruct {
            my_things: HashMap::new(),
        }
    }

    pub fn add_object(&mut self, key: i32, obj: Object) {
        if !self.my_things.contains_key(&key) {
            self.my_things.insert(key, Vec::new());
        }
        let new_thing: Box<dyn Thing> = Box::new(obj);
        let things = self.my_things.get_mut(&key).unwrap();
        things.push(new_thing);
        things.sort();
    }
}
It essentially takes a key and an Object, and adds the object to a HashMap of Vecs under the given key. I know this isn't the optimal way to do this, but I wanted to keep it simple for illustration.
The compiler complains about the call to things.sort() with the following error:
error[E0308]: mismatched types
  --> src/main.rs:58:16
   |
58 |         things.sort();
   |                ^^^^ lifetime mismatch
   |
   = note: expected trait `Ord`
              found trait `Ord`
note: the lifetime `'a` as defined on the impl at 42:6...
  --> src/main.rs:42:6
   |
42 | impl<'a> MyStruct<'a> {
   |      ^^
   = note: ...does not necessarily outlive the static lifetime
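From what I've read about default trait object lifetimes, my best guess is that impl Ord for dyn Thing is shorthand for impl Ord for (dyn Thing + 'static), so the Ord bound that sort() needs is only satisfied when 'a is 'static. A quick experiment seems consistent with that (assert_ord is just a throwaway helper I wrote to probe the bound, not anything from the standard library):

// Hypothetical helper: compiles only if T satisfies Ord.
fn assert_ord<T: Ord>() {}

fn probe<'a>() {
    // Compiles: the manual impls above cover `dyn Thing + 'static`.
    assert_ord::<Box<dyn Thing + 'static>>();

    // Does not compile: no impl covers `dyn Thing + 'a`, and the
    // compiler complains that `'a` does not necessarily outlive
    // the static lifetime:
    // assert_ord::<Box<dyn Thing + 'a>>();
}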
If I remove all of the 'a lifetimes in this example, the code will compile. But for my actual use case, I need to allow non-static lifetimes.
Can someone explain what's going on here? Does sort() really require the Vec to contain items with static lifetimes, and if so, why? Is there a good workaround?
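One workaround I've been considering (I'm not sure whether it's idiomatic) is to make the comparison impls generic over the trait object's lifetime, replacing the impls shown earlier with something like:

impl<'a> PartialEq for (dyn Thing + 'a) {
    fn eq(&self, other: &Self) -> bool {
        self.value() == other.value()
    }
}

impl<'a> Eq for (dyn Thing + 'a) {}

impl<'a> PartialOrd for (dyn Thing + 'a) {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}

impl<'a> Ord for (dyn Thing + 'a) {
    fn cmp(&self, other: &Self) -> Ordering {
        self.value().cmp(&other.value())
    }
}

In a quick test this seems to make things.sort() compile with the 'a lifetimes intact, but I'd appreciate confirmation that this is the right fix rather than something that just silences the error.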