mirror of
https://github.com/doctrine/orm.git
synced 2026-04-29 17:33:15 +02:00
Cache for AttributeReader mapping reader
#6929
Originally created by @boesing on GitHub (Feb 15, 2022).
Feature Request
Summary
Hey there,

I actually miss the AttributeReader cache (there was a cache for the annotations, and it saved a lot of repeated parsing). Was there an active decision against such a cached reader? Is this something of interest here?

Is reading from a filesystem cache more performant than using ReflectionClass on entities? (I haven't made actual benchmarks, but from my experience with laminas-hydrator, I know that using reflection is actually insanely slow.)

Would love to get some feedback here, and if this is of interest, I'm happy to contribute that cached reader for attributes.
@derrabus commented on GitHub (Feb 15, 2022):
The cached annotations reader was reasonable because annotations had to be parsed in userland out of an element's doc block.
With attributes, parsing is done by the PHP interpreter itself. Our cache in this case should be PHP's own opcache. Are attributes performing measurably worse than Doctrine Annotations in your case?
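To illustrate the point being made here, a minimal sketch (not Doctrine's actual reader, and the `Entity` attribute below is a hypothetical stand-in for Doctrine's mapping attributes): PHP 8 attributes are compiled by the engine along with the class, so the compiled representation sits in opcache, and reading them back is a reflection call rather than userland doc-block parsing.

```php
<?php
// Hypothetical mapping attribute, standing in for e.g. Doctrine's
// #[ORM\Entity]. The attribute definition is parsed by the PHP engine
// at compile time and cached in opcache together with the class.
#[Attribute(Attribute::TARGET_CLASS)]
final class Entity
{
    public function __construct(public readonly ?string $table = null)
    {
    }
}

#[Entity(table: 'users')]
final class User
{
}

$reflection = new ReflectionClass(User::class);

foreach ($reflection->getAttributes(Entity::class) as $attribute) {
    // newInstance() builds the attribute object from the already-compiled
    // metadata; no userland parsing is involved, unlike doctrine/annotations.
    $entity = $attribute->newInstance();
    echo $entity->table, PHP_EOL; // prints "users"
}
```

This is why a filesystem cache for attributes would mostly duplicate work the engine has already done; whether the remaining reflection overhead matters is exactly the benchmarking question raised in this thread.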
@boesing commented on GitHub (Feb 15, 2022):
No, I haven't verified this yet (no benchmarks).
And I think it might be less problematic, as every class metadata already contains the ReflectionClass, and thus the performance impact might not be as huge as I assumed.

I just noticed performance issues when using ReflectionClass along with laminas-hydrator (ReflectionHydrator), but that was due to the fact that ReflectionClass was called multiple times even though it had already been instantiated in the past. Somehow, this was not cached by opcache (but maybe opcache was disabled in my debugging) and thus led to performance gaps.

Since the metadata is cached in memory, the attribute parsing will only happen once, so I think I would be fine without such a cache in place.
@derrabus commented on GitHub (Feb 15, 2022):
All right. I'm closing this ticket for now. If you encounter actual performance issues that might be solved by introducing a cache to the reader, feel free to reopen.