I really don't see why you wouldn't expect to find cache sensitivity in Python, or in any other similarly high-level language.
Python sequences are defined as "support[ing] efficient element access using integer indices" [1]. Python lists are sequences and thus must support random access. In practice that means the implementation is a (dynamically resizing) array allocated contiguously, so spatial locality comes into play.
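To make that concrete, here's a rough sketch of my own (not anyone's benchmark, and the exact numbers will vary by machine and Python version): sum the same list once in index order and once in shuffled index order. The per-element work is identical, so most of the timing gap comes from how the list's backing array of pointers (and the int objects behind it) gets touched.

    # Illustrative only: sequential vs. shuffled traversal of one list.
    # On CPython the shuffled pass is typically noticeably slower, partly
    # because the pointer array is no longer walked in order.
    import random, time

    n = 5_000_000
    data = list(range(n))
    order_seq = list(range(n))
    order_rand = order_seq[:]
    random.shuffle(order_rand)

    def total(indices):
        s = 0
        for i in indices:
            s += data[i]
        return s

    for name, order in (("sequential", order_seq), ("shuffled", order_rand)):
        t0 = time.perf_counter()
        total(order)
        print(name, round(time.perf_counter() - t0, 3), "s")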
If the list type were defined simply as an iterable collection, with no requirement of constant-time random access, the definition would be abstract enough that an implementation might end up being something other than a contiguous array. But if you define the type so that it supports constant-time random access [2], you pretty much get cache-friendly sequential access as well.
If you don't define the list type as supporting random access, you also sacrifice asymptotic algorithmic efficiency for lookups by index. Any language that cares about efficiency at all separates collections that support efficient random access from those that don't. (For example, Python has lists and dicts/sets for different access patterns. The Java standard library has separate types for contiguous arrays/lists, hashtable-backed collections, and tree-backed collections because the three have different algorithmic efficiencies for different access patterns. In practice that leads to different cache-friendliness properties as well.)
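A toy illustration of why the distinction matters (just a sketch; the sizes and names here are arbitrary): membership testing on a list is a linear scan, while a set does a hash lookup, so the same query has very different costs depending on which collection you picked.

    # Illustrative only: list membership is O(n), set membership is O(1) average.
    import time

    n = 200_000
    as_list = list(range(n))
    as_set = set(as_list)
    needle = n - 1  # worst case for the list scan

    t0 = time.perf_counter()
    for _ in range(200):
        needle in as_list
    print("list membership:", round(time.perf_counter() - t0, 4), "s")

    t0 = time.perf_counter()
    for _ in range(200):
        needle in as_set
    print("set membership: ", round(time.perf_counter() - t0, 4), "s")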
[1] https://docs.python.org/3/glossary.html#term-sequence
[2] As usual, the Python documentation is a bit vague on the details. It doesn't actually say random access has to be constant-time, only that it has to be "efficient". So you might be able to get away with a non-constant-time implementation such as an indexable skip list while arguing that it's efficient, but you'd have to go out of your way to do that.