This was tested using a gcsfuse mount on a Google Compute Engine instance running Ubuntu 15.10. The tests use a single large 128 MB file. I have read that gcsfuse becomes very slow when working with lots of small files, for instance when using rsync to back up websites; for that case Google suggests using gsutil with its rsync command instead. A minimal mount and sync sketch is shown below.
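For reference, here is a rough sketch of how the bucket can be mounted and how a small-file backup could be sent with gsutil instead of through the mount. The bucket name my-nearline-bucket and the local path /var/www are hypothetical; the mount point matches the nearline-container directory used in the commands below.

# mount the bucket into a local directory (assumes gcsfuse is installed and the VM has storage access)
mkdir -p nearline-container
gcsfuse my-nearline-bucket nearline-container

# for many small files, sync directly to the bucket with gsutil instead of rsync over the mount
gsutil -m rsync -r /var/www gs://my-nearline-bucket/www-backup

# unmount when finished
fusermount -u nearline-container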
Nearline:
Write
Using Zeros
dd if=/dev/zero of=nearline-container/test.data bs=1k count=128k
131072+0 records in
131072+0 records out
134217728 bytes (134 MB) copied, 4.93824 s, 27.2 MB/s
Using Random
dd if=/dev/urandom of=nearline-container/random.data bs=1k count=128k
131072+0 records in
131072+0 records out
134217728 bytes (134 MB) copied, 13.8312 s, 9.7 MB/s
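The lower figure with random data may partly reflect how fast /dev/urandom can generate data rather than the bucket's write speed. A quick sanity check (not part of the original test) is to time urandom on its own and compare:

dd if=/dev/urandom of=/dev/null bs=1k count=128k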
Read
Using Zeros
pv nearline-container/test.data >test.data
128MiB 0:00:00 [ 131MiB/s]
[===================================================================>] 100%
Using Random
pv nearline-container/random.data >random.data
128MiB 0:00:00 [ 130MiB/s]
[===================================================================>] 100%