In UNIX, everything is a file, or so the saying goes. A "file" is anything that behaves like a stream of bytes, which covers far more than data stored on disk. Treating everything (data, hardware devices, network sockets, directories, and more) as a file allows for great flexibility: code written with one type of file in mind (typically disk data) can be applied to other types of files, even in ways the original author never planned for.

Most importantly, each program's input ("stdin") and output ("stdout") are files, and one program's stdout can serve as another program's stdin, possibly many times in succession, forming what are known as pipelines. In this manner, small programs can be chained together in innumerable combinations to produce functionality far beyond what any single program's author conceived.

For example, "grep '^r.*t$' /usr/share/dict/words | tr a-z A-Z | sort -r" searches /usr/share/dict/words for every word beginning with "r" and ending with "t", and prints the matches to the screen, capitalized and in reverse alphabetical order. (Note that the regular expression is quoted so the shell does not treat characters like "*" and "$" as special.) Likewise, "cat /dev/fd0 > ~/img" copies the contents of the floppy disk to a file in the user's home directory, and "cat ~/img > /dev/fd0" writes the disk image back to the floppy, which is useful for making duplicates. These examples use programs that merely match patterns, substitute letters, sort lines, and display text, yet when combined they can do very interesting things seemingly beyond the scope of their original design.
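A pipeline like the dictionary example above can be tried even on a system without /usr/share/dict/words by feeding in a small word list of our own (the words below are chosen purely for illustration):

```shell
# Generate a few words, one per line, in place of a system dictionary,
# then run them through the same grep | tr | sort pipeline.
printf '%s\n' rabbit robot rat carrot roast apple \
  | grep '^r.*t$' \
  | tr a-z A-Z \
  | sort -r
# Prints:
# ROBOT
# ROAST
# RAT
# RABBIT
```

Each stage reads from its stdin and writes to its stdout, so the stages neither know nor care whether their input came from a real file, a device, or another program.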