A couple of points have been missed in this discussion. Firstly, the
willingness to read content from a non-regular file is not specific to
performing the check on an open filehandle. Perl is perfectly willing
to read content from a non-regular file specified by name:
$ perl -lwe 'print -T "/dev/null" || 0; print -T "/dev/zero" || 0'
Secondly, some of this discussion rests on the mistaken idea that a
non-regular file is somehow "not a file". Looking at the documentation
of the file test operators, it has clearly been written using Unix
terminology, in which anything in the filesystem is a file:
# -f File is a plain file.
# -d File is a directory.
# -b File is a block special file.
Given that understanding of what "file" means in this document, we can
better interpret the documentation for -T and -B:
# -T File is an ASCII or UTF-8 text file (heuristic guess).
# -B File is a "binary" file (opposite of -T).
# The "-T" and "-B" switches work as follows. The first block or
# so of the file is examined to see if it is valid UTF-8 that
# includes non-ASCII characters. If so, it's a "-T" file.
It speaks of examining the content of "the file", and says nothing
about that file being regular, just as the documentation for -w et al.
says nothing about the file being regular. -w, -f, and -T are mutually
orthogonal.
Clearly, reading from non-regular files is intentional behaviour,
implemented consistently, and documented. The choice of such a predicate
for this short spelling may be a questionable language design decision,
but it's too late to change that. There isn't any viable case for
changing the existing behaviour. If you want a regular-text-file
predicate, by all means write your own and stick it on CPAN.
This ticket should be closed.