If you can live with two-second accuracy, the MS-DOS timestamp format used 16 bits to hold the date (year minus 1980 as 7 bits, month as 4, day as 5) and 16 bits for the time (hour as 5, minute as 6, and seconds divided by two as 5). On a processor like the Arduino it may be possible to write code that splits a value across a 16-bit boundary, but I think the code will be more efficient if you can avoid such a split (as MS-DOS did by accepting two-second accuracy).
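For reference, a minimal sketch of that packing (the function names are mine; the layout is the standard MS-DOS one, with seconds stored divided by two):

```c
#include <stdint.h>

/* Pack a calendar date/time into the two 16-bit MS-DOS words.
   Seconds are stored divided by two, hence the two-second resolution. */
static uint16_t dos_date(uint16_t year, uint8_t month, uint8_t day)
{
    return (uint16_t)(((year - 1980) << 9) | (month << 5) | day);
}

static uint16_t dos_time(uint8_t hour, uint8_t minute, uint8_t second)
{
    return (uint16_t)(((uint16_t)hour << 11) | (minute << 5) | (second >> 1));
}

/* Unpack again; each field is a simple shift and mask. */
static void dos_unpack(uint16_t d, uint16_t t,
                       uint16_t *year, uint8_t *month, uint8_t *day,
                       uint8_t *hour, uint8_t *minute, uint8_t *second)
{
    *year   = 1980 + (d >> 9);
    *month  = (d >> 5) & 0x0F;
    *day    = d & 0x1F;
    *hour   = t >> 11;
    *minute = (t >> 5) & 0x3F;
    *second = (t & 0x1F) * 2;
}
```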
Otherwise, as noted in another answer, using a 32-bit count of seconds since some base time will often be more efficient than trying to keep track of things in "calendar format". If all you ever need to do is advance from one calendar date to the next, the code for that may be simpler than code to convert between calendar and linear dates; but if you need to do much of anything else (even step backward from a date to the previous one), you'll likely be better off converting dates to/from linear format only when they're input or displayed, and otherwise simply working with linear counts of seconds.
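To illustrate the point, a tiny sketch (the names are mine, and the flat 86400-second day is the usual simplification): with a linear representation, stepping in either direction is plain 32-bit arithmetic, with no month-length or leap-year logic until the value is displayed.

```c
#include <stdint.h>

/* Moving around in linear time is plain arithmetic; calendar rules only
   matter when converting for input or display. */
uint32_t previous_day(uint32_t secs) { return secs - 86400UL; }
uint32_t next_week(uint32_t secs)    { return secs + 7UL * 86400UL; }
```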
Working with linear time can be made more convenient if you pick March 1 of a leap year as the baseline date. To turn a day count back into a calendar date, first handle whole four-year cycles: while the count is 1461 or more (1461 being the length of a four-year cycle), subtract 1461 and add 4 to the year (16-bit comparison and subtraction are cheap on the Arduino, and even in 2040 that loop may still take less time than a single 16x16 division). Then, if the count is 365 or more, subtract 365 and increment the year, and repeat that up to twice more; if the count is still 365 after the third subtraction, leave it there, since that remainder is February 29.
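Putting that together, here is a rough sketch of the day-count-to-date conversion (assumptions not in the answer itself: day 0 is 1 March 2000, the count fits in 16 bits, and all the names are mine). The month is found with a small table scan rather than a multiply/divide, in keeping with the goal of avoiding division:

```c
#include <stdint.h>

struct cal_date { uint16_t year; uint8_t month; uint8_t day; };

/* Day-of-year offsets from 1 March for each month, March first.
   The final entry (366) covers the leap day at the end of the cycle. */
static const uint16_t month_start[13] = {
    0, 31, 61, 92, 122, 153, 184, 214, 245, 275, 306, 337, 366
};

struct cal_date days_to_date(uint16_t days)
{
    struct cal_date d;
    uint16_t year = 2000;          /* assumed baseline: 1 March 2000 (a leap year) */
    uint8_t  i;

    /* Whole four-year cycles: 1461 days each. */
    while (days >= 1461) { days -= 1461; year += 4; }

    /* Up to three whole 365-day years.  After the third subtraction the
       remainder may legitimately be 365: that is 29 February. */
    for (i = 0; i < 3 && days >= 365; i++) { days -= 365; year++; }

    /* Find the month by scanning the table (no division needed). */
    for (i = 0; days >= month_start[i + 1]; i++)
        ;
    d.day   = (uint8_t)(days - month_start[i] + 1);
    d.month = (i < 10) ? (uint8_t)(i + 3) : (uint8_t)(i - 9);
    d.year  = (i < 10) ? year : year + 1;  /* Jan/Feb fall in the next calendar year */
    return d;
}
```

The reverse conversion (calendar date to day count) can reuse the same table, adding month_start[] for the month, the day of the month, and 365 or 1461 per elapsed year or four-year cycle.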
Some care is needed to ensure that all corner cases work correctly, but even on a little 8-bit or 16-bit micro, conversions can be surprisingly efficient.