If it is a closed system and the process is random, it is unlikely to have the same entropy; it will likely have more entropy, and it will never have less.
No. Entropy is a state function. Random, non-random, deterministic, chaotic, quantum, spooky, etc., all processes lead to the same entropy for the same state. This is what Sewell seems to be missing in his appendix.
I stated the second law of thermodynamics and you said "No"?
Of course entropy is a state function. It is a measure of the thermodynamic state at a particular point in time. If time x occurs before time y, the state at time x will always have entropy less than or equal to the entropy of the state at time y, for a closed system.
"all processes lead to the same entropy for the same state"
The state changes with time. That is like saying the entropy at time x is the same as the entropy at time x. Maybe all of your systems are assumed to already be at equilibrium, but even then the statement is a stretch. Maybe you have invented a way to stop time (travel at the speed of light), but I doubt it.
Perhaps I'm not being clear. The entropy of a system in a given state does not depend on the path the system takes to get to that state.
A system may proceed from state (T1,V1) to (T2,V2) by reversible, random, irreversible, deterministic or any other process. The entropy at (T2,V2) will be the same no matter what path is taken.
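To make the path-independence point concrete, here is a minimal sketch for an ideal gas (an assumption; the thread never specifies a working substance). The entropy change between (T1, V1) and (T2, V2) can be computed along two different reversible paths through different intermediate states, and the totals agree because entropy depends only on the endpoints. All numerical values are illustrative.

```python
import math

# Illustrative constants: 1 mol of a monatomic ideal gas (Cv = 3/2 R).
R = 8.314        # gas constant, J/(mol K)
n = 1.0          # moles
Cv = 1.5 * R     # molar heat capacity at constant volume

T1, V1 = 300.0, 0.010   # initial state (K, m^3)
T2, V2 = 600.0, 0.020   # final state

def dS_isochoric(Ta, Tb):
    """Entropy change on heating at constant volume: n Cv ln(Tb/Ta)."""
    return n * Cv * math.log(Tb / Ta)

def dS_isothermal(Va, Vb):
    """Entropy change on expansion at constant temperature: n R ln(Vb/Va)."""
    return n * R * math.log(Vb / Va)

# Path A: heat at constant volume via (T2, V1), then expand at T2.
path_a = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)

# Path B: expand at constant temperature via (T1, V2), then heat at V2.
path_b = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)

print(path_a, path_b)  # the two paths give the same total entropy change
```

The same equality would hold for an irreversible process between those states; one would just evaluate the entropy change along any convenient reversible path connecting the same endpoints.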