… rather than minimizing change amount.
This is more of a discussion issue, to see if this is appropriate. I haven't tested it (and indeed, it's going to break all coin selection tests).
The current coin selection code is a bit weird and already aggressively spends small inputs: if you are sending N, it first tries to use only inputs smaller than N. (Actually, I've commented it all out in the version of the code I run, as I've found it to be rather harmful.) But anyway, ApproximateBestSubset is only used when it can't source N from inputs <= N -- so I think it makes sense to use a totally different algorithm there.
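For illustration, a minimal sketch of the flow described above -- first check whether inputs smaller than the target can cover it, and only fall back to the subset search otherwise. The function name and structure here are hypothetical, not the actual wallet code:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Sketch (assumed names): decide whether the "inputs <= target" pass can
// fund the payment at all. In Bitcoin Core, ApproximateBestSubset would
// only be reached when this returns false.
bool CanFundFromSmallerInputs(const std::vector<int64_t>& utxos, int64_t target) {
    int64_t total = 0;
    for (int64_t v : utxos) {
        if (v <= target) total += v;  // only inputs no larger than the target
    }
    return total >= target;  // if false, a different algorithm must take over
}
```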
And instead of trying to minimize the change (which is a wet dream for wallet clustering services), I think it makes sense to instead optimize for the number of inputs you need to use.
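One simple way to optimize for input count is a greedy largest-first pass -- take the biggest UTXOs until the target is covered. This is just a sketch of the idea, not a proposed implementation (it ignores fees, privacy, and change handling entirely):

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <functional>
#include <vector>

// Sketch: greedily spend the largest UTXOs first, which tends to minimize
// the number of inputs needed. Returns the chosen values, or an empty
// vector if the wallet can't cover the target at all.
std::vector<int64_t> SelectFewestInputs(std::vector<int64_t> utxos, int64_t target) {
    std::sort(utxos.begin(), utxos.end(), std::greater<int64_t>());
    std::vector<int64_t> chosen;
    int64_t total = 0;
    for (int64_t v : utxos) {
        if (total >= target) break;
        chosen.push_back(v);
        total += v;
    }
    if (total < target) chosen.clear();  // insufficient funds
    return chosen;
}
```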
--
And on a more radical note:
I think it's really important to have a bunch of different coin selection algorithms to pick from. One size fits all is pretty painful. When I'm sending transactions at 200 satoshi/byte, I really, really don't want to be consolidating all my unspents, and I don't care if I'm grinding dust outputs. However, when I'm sending an ultra-low-priority transaction (e.g. sending money from my hot wallet to cold storage) and only paying 5 sat/byte, the more UTXO consolidation the better!
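The feerate-dependent behavior above could be sketched as a simple strategy switch -- the enum, function, and the 20 sat/byte threshold are all made up for illustration:

```cpp
#include <cassert>

// Hypothetical sketch: pick a selection strategy based on feerate.
// High feerate -> spend as few inputs as possible; low feerate -> the
// transaction is cheap per byte, so consolidate UTXOs opportunistically.
enum class Strategy { MinimizeInputs, Consolidate };

Strategy PickStrategy(double feerate_sat_per_byte, double threshold = 20.0) {
    return feerate_sat_per_byte >= threshold ? Strategy::MinimizeInputs
                                             : Strategy::Consolidate;
}
```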