Dataflow is among the most efficient computational paradigms. Jack Dennis, the father of dataflow, defined it as "A Scheme of Computation in which an activity is initiated by presence of the data it needs to perform its function". New ways of exploiting this principle are now available. In terms of programming models we see at least OmpSs, OpenStream, DDM-TFLUX, DataFlow-Scala, and SWARM, to name just a few. Start-ups such as CAPS and MAXELER are also making powerful dataflow-based technologies available. How effective are these technologies? Is there a clear winner? How do these approaches complement the best solutions currently available on the market? The invited experts, moderated by Prof. Roberto Giorgi, tried to answer these and other questions in front of an audience of about 50 people during the MULTIPROG workshop at the HiPEAC conference in Berlin, Germany. The panel of experts included: Prof. Eduard Ayguade - BSC/UPC, Prof. Ian Watson - The University of Manchester, Prof. Albert Cohen - INRIA, Joshua Sutterlein - University of Delaware, Oliver Pell - MAXELER, and Lawrence Rauchwerger - TAMU.
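
Dennis's firing rule can be illustrated with a minimal sketch (the class and graph below are purely illustrative, not taken from any of the systems named above): each node in a dataflow graph executes as soon as all of its input operands have arrived, with no central program counter ordering the work.

```python
# Minimal sketch of the dataflow firing rule, assuming a toy graph
# representation (Node, slots, consumers are illustrative names).

class Node:
    def __init__(self, name, op, num_inputs):
        self.name = name
        self.op = op
        self.inputs = [None] * num_inputs
        self.consumers = []   # (downstream node, input slot) pairs
        self.result = None

    def receive(self, slot, value, log):
        """Deliver one operand token; fire when all operands are present."""
        self.inputs[slot] = value
        if all(v is not None for v in self.inputs):
            self.result = self.op(*self.inputs)
            log.append(self.name)
            # Propagate the result token to every consumer.
            for node, s in self.consumers:
                node.receive(s, self.result, log)

# Graph for (a + b) * (c + d): the multiply is initiated only by the
# presence of both sums, not by any sequential control flow.
add1 = Node("add1", lambda x, y: x + y, 2)
add2 = Node("add2", lambda x, y: x + y, 2)
mul  = Node("mul",  lambda x, y: x * y, 2)
add1.consumers.append((mul, 0))
add2.consumers.append((mul, 1))

log = []
add1.receive(0, 1, log)
add1.receive(1, 2, log)   # add1 fires with 1 + 2 = 3
add2.receive(0, 3, log)
add2.receive(1, 4, log)   # add2 fires with 3 + 4 = 7, which triggers mul
```

After the last token arrives, `mul.result` is 21 and the log records that `mul` fired only after both additions, which is the essence of the data-driven execution the panel discussed.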