Bioinformatics unavoidably requires prediction. There's only so much an analysis in silico will tell us about phenomena in vivo.
I'm worried that, as the availability of biochemical and genetic data becomes taken for granted, prediction will be replaced by meaningless, slavish futurism. It's already happening: a recent Wired cover story on CRISPR carries the headline "No hunger. No pollution. No disease. And the end of life as we know it." Yes, it's hyperbolic. Yes, CRISPR-based technologies are already proving their use. It still falls into the same trap as so much other Wired-caliber futurism: the technology is advancing so rapidly that the futurism is already outdated. Unintended consequences can be addressed because they aren't some kind of unstoppable runaway train; they're a present topic of study.
A piece on Medium from a few months back labeled this kind of streamlined thought as futch (rhymes with brooch, not butch or crutch*). The conventional wisdom has been that it's how we make big ideas accessible while retaining their freshly-polished chrome glint. Leave the gritty details to the scientists and engineers. If you're in one of those two groups, go interdisciplinary. Find someone outside your field to fill in those details you can't handle. Can't build the future yourself? Watch a TED talk! Join a Facebook group! Wait for the Future to land in your backyard!
The problem arises when the glossed-over details are those most relevant to everyday life. This is especially true of bioinformatics, as we've seen with issues like 23andMe's handling of their customers' sequence data.** Sure, personal genome sequencing seems like a hallmark of the Future, but it's here now, and questions like "who owns my genetic data?" are relevant to everyone with a genome.
*Or however Kanye pronounces it. It's just the vacay-form. Don't overthink it.