Panel 19:32

Edge Serverless: Discussion by Vox Media, Mux, & PerimeterX | Fastly

Presented by

Kimmie Nguyen

Moderator: VP of Product Growth, Fastly


Pablo Mercado

CTO, Vox Media


Phil Cluff

Streaming Architect, Mux


Ido Safruti

Co-founder & CTO, PerimeterX


Meet the pioneers who are building on Compute@Edge and helping shape the future of serverless. This crew of innovative thinkers and problem solvers are actively exploring how to harness the power of Compute@Edge to build the future of their applications — and dreaming up new possibilities in the process. We’ll discuss what they’ve built so far, how they’d like to see the product develop, and where they think serverless is headed.

Edge serverless: the future of the web

See how Compute@Edge is enabling new levels of performance for both developers and users.

Video Transcript

Kimmie Nguyen (00:08):

Hi everyone. Welcome. I'm Kimmie Nguyen, Vice President of Product Growth at Fastly, and I'm really excited to lead today's conversation about Compute@Edge and the future of serverless. I've been here since the early days of Fastly, and the vision of Compute@Edge has been baked into our DNA from day one. Joining me today are three leaders at organizations that have been innovating on Fastly's serverless compute environment. We'll take this opportunity to learn more about their experiences and discuss where they see the market heading. Gentlemen, thanks so much for your time. How about we start with some brief introductions?


Pablo Mercado (00:45):

I'm Pablo. I am the Chief Technology Officer at Vox Media. I've been at Vox Media since the beginning, when we literally started in a basement maybe 13 years ago and it's been really fun to build a business along the way. Fastly has been a key partner for us, so I'm very happy to be participating in this conference.


Phil Cluff (01:04):

I'm Phil. I work for Mux, which is a small San Francisco-based startup. I've been in the streaming video industry for about 10 years now. I've worked at places like the BBC and then at Brightcove, an OVP that's also a Fastly partner. And then obviously at Mux we, among other things, build video services, an API-driven video platform, as opposed to your more traditional OVPs, and also analytics systems for the biggest customers, like Fox for the Super Bowl and CBS for the Super Bowl the year before.


Ido Safruti (01:45):

Hi, my name is Ido Safruti. I'm co-founder and CTO of PerimeterX.


Kimmie Nguyen (01:49):

As more and more organizations get access to Compute@Edge, we see that different companies are intrigued for different reasons. Some tell us speed is most important, some tell us security is the reason they want to build on it, and others say it's the fact that it operates at the edge. So I really want to hear what aspects of Compute@Edge are most attractive to you and your teams.


Phil Cluff (02:10):

I think Compute@Edge is attractive to us for a variety of reasons. Getting as close to the edge as we can allows us to optimize for performance, right? We primarily deal with media, right? Video files, audio files, and the closer they're cached to the user, obviously the better the performance for the end viewer, which means less buffering, which everyone wants, right? But it's a little bit more than that, because there's a lot of devices out there. There's smart TVs, set-top boxes, there's hundreds of different types of smartphones, and sometimes you need to manipulate that media to make it work optimally for a particular device. And for us, what's most exciting about doing more at the edge is pushing media manipulation out to the edge, right? So if we have to have three different formats of a video, which is believable, we're going to have to store and cache that three times, and it's going to take three round trips to the origin, depending on which device is playing it, and where and how and those sorts of things.


Phil Cluff (03:18):

So being able to manipulate things like media, but also manifests, closer to the edge is really important to us, because it improves round trip time for streaming, but it also reduces cache footprint, which is huge, right? If you're storing three copies of every object in cache, that's challenging and reduces your cache performance, but it also increases cost ultimately, because things fall out of cache more frequently and you end up just spending more time doing round trips as well. So really for us, it's a combination of doing it as close as we can to the user, but also doing as much of it on the edge as we can to improve performance, but also improve that cache flow.
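
To make that concrete, here is a minimal sketch of the device-aware manifest rewriting Phil describes, written against the Compute@Edge Rust SDK. It is not Mux's implementation: the backend name, the User-Agent check, and the rewrite rule are all illustrative placeholders, and the SDK usage is a simplified example.

```rust
// Minimal Compute@Edge sketch (Rust SDK assumed, e.g. fastly = "0.8" in Cargo.toml).
use fastly::http::StatusCode;
use fastly::{Error, Request, Response};

// Hypothetical backend name; it would be defined on the Fastly service.
const MANIFEST_ORIGIN: &str = "manifest_origin";

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Check the device class before the request is consumed by send().
    let constrained_device = req
        .get_header_str("user-agent")
        .map(|ua| ua.contains("SmartTV")) // placeholder device detection
        .unwrap_or(false);

    // One canonical manifest is fetched (and cached) from the origin,
    // instead of caching a separate variant per device class.
    let mut beresp = req.send(MANIFEST_ORIGIN)?;
    if !beresp.get_status().is_success() {
        return Ok(Response::from_status(StatusCode::BAD_GATEWAY));
    }
    let manifest = beresp.take_body_str();

    // Rewrite the shared HLS manifest per device at the edge: drop 4K
    // renditions (and their URI lines) for devices that can't play them.
    let mut rewritten = String::new();
    let mut skip_next_uri = false;
    for line in manifest.lines() {
        if skip_next_uri && !line.starts_with('#') {
            skip_next_uri = false; // skip the URI belonging to the dropped rendition
            continue;
        }
        if constrained_device && line.contains("RESOLUTION=3840x2160") {
            skip_next_uri = true;
            continue;
        }
        rewritten.push_str(line);
        rewritten.push('\n');
    }

    Ok(Response::from_status(StatusCode::OK)
        .with_header("content-type", "application/vnd.apple.mpegurl")
        .with_body(rewritten))
}
```

Because only one canonical manifest sits in cache, the per-device variants never compete for cache space, which is the footprint win Phil points to.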


Pablo Mercado (03:58):

Actually, it's not just a CDN for Vox Media; I consider it to be a business delivery network. Our portfolio of brands, The Verge, New York Magazine, vox.com, our Concert advertising partners, our SaaS customers, reach millions and millions of people every day with our podcasts and videos and digital media content, and at their first point of contact, all this traffic goes through Fastly. So it's exciting to have Compute@Edge capabilities available to us to create and optimize our experiences for these audiences right there at the edge, right at this first point of contact. It has enormous potential from a performance perspective; obviously, having logic and caching executed right at the first point of contact is fantastic for performance. But it also has enormous potential from a creativity perspective, and that's really what excites me.


Pablo Mercado (04:54):

Having Compute@Edge is like having a middleware stack suddenly available for our entire business. I know many of you, and engineers out there, have built systems where inserting some new logic upstream can be brittle. Then you put in a middleware kind of stack, and it gives you the flexibility to insert or remove logic where you need it to be, and that enables a lot of flexibility and safety in what you build. That is what I see with the capabilities that Compute@Edge provides. Suddenly our developers and creative thinkers can assert business logic or test out new functionality with increased, and I think fantastic, agility, and also, and this is the kicker, this is really the promise of serverless, with great confidence in the performance and stability that's inherent in the Fastly network. So it's really exciting.


Ido Safruti (05:50):

So for us, the benefit, and the reason we're so excited about Compute@Edge, is the ability to run our code and enhance the Fastly ecosystem for our customers without the need to add another shim or another application from our end. By that, we can add security logic for our security products, whether it's bot mitigation or detection of malicious code on the client side, without complicating things, and introduce these capabilities without customers needing to change anything on their backend or their application. Being able to handle all this malicious traffic or malicious activity at the edge, away from their data center and away from the application, is what excites us and what makes it so great for us. So for us, it really is the ability to run on the edge.
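
As an illustration of the pattern Ido describes, handling suspicious traffic at the edge so the origin needs no changes, here is a deliberately simplified sketch against the Compute@Edge Rust SDK. It is not PerimeterX's detection logic; the backend name and the single-header heuristic stand in for a much richer set of signals.

```rust
// Simplified edge-filtering sketch (Fastly Rust SDK assumed).
use fastly::http::StatusCode;
use fastly::{Error, Request, Response};

// Hypothetical backend name for the protected, unmodified application.
const APP_ORIGIN: &str = "app_origin";

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Deliberately naive signal: real bot mitigation combines many client-side
    // and server-side signals, not a single User-Agent check.
    let ua = req.get_header_str("user-agent").unwrap_or("");
    let looks_automated =
        ua.is_empty() || ua.contains("curl") || ua.contains("python-requests");

    if looks_automated {
        // The suspicious request is handled entirely at the edge; the origin
        // never sees it and needs no code changes of its own.
        return Ok(Response::from_status(StatusCode::FORBIDDEN)
            .with_body_text_plain("Request blocked at the edge\n"));
    }

    // Clean traffic is proxied through to the application backend.
    Ok(req.send(APP_ORIGIN)?)
}
```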


Kimmie Nguyen (06:50):

Part of the reason I was so excited to be on this panel was that you're all in different phases of trialing Compute@Edge. Some of you are being creative and dreaming, others are testing and building, and I really want to hear about what you've done so far and what you're planning to do next.


Phil Cluff (07:08):

So we're very much in a prototyping phase, as many people are. We have very complex VCL logic right now, everything from authentication through to manifest manipulation, which is rewriting text files that control how video is delivered to end users. So the first thing we really want to do is take that logic and move it into something that's more unit testable and more deployable on different platforms and that sort of thing. So taking all that kind of years of VCL that we've developed and turning it into something that's way more future facing.
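
One way to picture what "more unit testable" buys you: once a rule moves out of VCL into ordinary code, it can be exercised directly with the language's test runner, no deploy required. The sketch below uses a hypothetical bearer-token check in Rust; it is not Mux's actual authentication logic.

```rust
// Hypothetical, framework-free rule: nothing here depends on the edge runtime,
// so it can be tested locally before it is ever deployed.
pub fn is_authorized(auth_header: Option<&str>, expected_token: &str) -> bool {
    match auth_header {
        Some(value) => value
            .strip_prefix("Bearer ")
            .map(|token| token == expected_token)
            .unwrap_or(false),
        None => false,
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn accepts_a_valid_bearer_token() {
        assert!(is_authorized(Some("Bearer secret-token"), "secret-token"));
    }

    #[test]
    fn rejects_missing_or_malformed_headers() {
        assert!(!is_authorized(None, "secret-token"));
        assert!(!is_authorized(Some("secret-token"), "secret-token"));
        assert!(!is_authorized(Some("Bearer wrong"), "secret-token"));
    }
}
```

Running `cargo test` exercises the rule locally, long before it ships to the edge.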


Phil Cluff (07:50):

The next steps beyond that really start to be more aggressive per-user authentication. I'm super interested, obviously, in finding out what Ido's got from the security perspective there, because that's really interesting and important to us, so that if you're watching a stream, you've got the rights to watch that stream, that sort of thing. And obviously then down the road, it's really about thinking about media manipulation at the edge.


Pablo Mercado (08:13):

We're also in prototyping mode, and it's been really exciting so far; we see a lot of potential. We've prototyped content API gateway-type functionality and authentication and authorization layers, and that's been really interesting. Content stitching services are something we're prototyping as well, services that can selectively stitch content and advertising logic together, et cetera. You know, this area is exciting. We do a lot of content customization and content translation for different outputs, and we need to tailor content for devices, network conditions, location, et cetera. We also personalize content and personalize functionality, et cetera. Being able to do that at the edge, and importantly, in a way that's flexible, that can be integrated with ease and safety and with great developer economics, is huge for us. And we are also very excited about making our VCL logic more accessible to our wider engineering teams. That's going to be important for us as well.
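
As a rough sketch of the content-stitching idea Pablo mentions, the following Compute@Edge Rust example fetches a page fragment and an ad fragment as separate, independently cacheable objects and stitches them together at the edge. The backend names, hosts, and placeholder marker are invented for illustration and are not Vox Media's setup.

```rust
// Edge content-stitching sketch (Fastly Rust SDK assumed).
use fastly::http::StatusCode;
use fastly::{Error, Request, Response};

// Hypothetical backend names configured on the service.
const CONTENT_ORIGIN: &str = "content_origin";
const ADS_ORIGIN: &str = "ads_origin";

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Fetch the page shell and an ad fragment as separate objects so each
    // can be cached on its own schedule.
    let content_url = format!("https://content.example.com{}", req.get_path());
    let mut page = Request::get(content_url.as_str()).send(CONTENT_ORIGIN)?;
    let mut ad = Request::get("https://ads.example.com/slot/top").send(ADS_ORIGIN)?;

    // Stitch the ad markup into a placeholder the content origin leaves behind.
    let stitched = page
        .take_body_str()
        .replace("<!-- AD_SLOT_TOP -->", &ad.take_body_str());

    Ok(Response::from_status(StatusCode::OK)
        .with_header("content-type", "text/html; charset=utf-8")
        .with_body(stitched))
}
```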


Ido Safruti (09:15):

So like Phil and Pablo, we're also in the prototyping phase, exploring and testing it with a couple of customers. Because of the nature of our business, we are adding logic at our customers' edge, for customers who are joint Fastly customers. Originally, we implemented our entire logic in VCL, which made it a bit complex, especially when customers may have their own business logic in VCL, and complex VCL at that. What we've done so far is leverage Compute@Edge to simplify that, as well as to enhance it and add capabilities that were not necessarily available in VCL, like API calls to our origin, to our cloud service, to enhance the decision whenever the decision at the edge couldn't be fully determined.


Ido Safruti (10:17):

This is simplifying the integration, making it much easier as well as more flexible, and adding more capabilities. Another application we're building at the edge is actually leveraging edge dictionaries to start introducing more state into this decision, by pushing some security logic and security rules in a very dynamic way, and not as static configuration that just sits there. This is where we're also taking advantage of the snappiness of configuration updates done by Fastly, and of the ability to write and read from edge dictionaries, which helps us push decisions and make them even more independent at the edge, reducing or eliminating some of the API calls that we tended to use before.
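
To illustrate the edge-dictionary pattern Ido describes, here is a minimal Compute@Edge Rust sketch that reads a per-path rule from an edge dictionary and acts on it entirely at the edge. The dictionary name, key scheme, and backend are assumptions for the example, not PerimeterX's product.

```rust
// Edge-dictionary sketch (Fastly Rust SDK assumed). The dictionary name,
// key/value scheme, and backend name are hypothetical.
use fastly::http::StatusCode;
use fastly::{Dictionary, Error, Request, Response};

const RULES_DICTIONARY: &str = "security_rules";
const APP_ORIGIN: &str = "app_origin";

#[fastly::main]
fn main(req: Request) -> Result<Response, Error> {
    // Edge dictionaries are updated out of band through the Fastly API, so
    // rules can change quickly without redeploying the Wasm binary.
    let rules = Dictionary::open(RULES_DICTIONARY);

    // Example rule shape: a per-path action ("block" or "allow") keyed by URL path.
    let action = rules
        .get(req.get_path())
        .unwrap_or_else(|| "allow".to_string());

    if action == "block" {
        // The decision is made entirely at the edge, with no call back to a
        // central cloud service.
        return Ok(Response::from_status(StatusCode::FORBIDDEN)
            .with_body_text_plain("Blocked by edge rule\n"));
    }

    Ok(req.send(APP_ORIGIN)?)
}
```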


Kimmie Nguyen (11:11):

Our roadmaps are built hand in hand with our customers, so I'd love to hear: where do you see Fastly taking Compute@Edge in the future?


Phil Cluff (11:18):

I think for us, it's probably going to be a different answer from everyone else here, but one of the most important things for us is getting everyone in this market involved in the Bytecode Alliance, right? Being a media company, multi-CDN is critical, and Fastly is a big part of that multi-CDN, not only as one of the edge delivery networks that we rely heavily upon, but also, for example, as an origin shield using the Media Shield product. But being in media, there isn't one CDN that's ever going to reach every user as performantly as every other one. There are geos that require specialist CDNs, and even delivering into private networks. So getting to a place where, through the Bytecode Alliance, we generate something that can be given to a large proportion of the CDNs out there is critical to making edge compute a realistic piece of day-to-day computing on media platforms, in my mind. And getting more storage on the edge is really important as well. I think data at the edge is a great step forward on that. But yeah, if we're going to do more complex manipulation, that means we need data stores that are as close as possible to the edge as well.


Pablo Mercado (12:35):

I agree with Phil here that data at the edge will be important, as well as the maturation of developer tools and operational tools. You know, developers, developers, developers, I think, is the mantra. So I want to see great tools for building and operating logic with Compute@Edge as we build more and more complex services on this platform. And these capabilities require access to data, so data at the edge is going to be very important as well.


Ido Safruti (13:11):

What I would love Fastly to do to enhance Compute@Edge is, in many cases, similar to what Pablo and Phil were saying: I think data is key. Enhancing data at the edge, bringing more capabilities, taking edge dictionaries even further, more complex or more complete, with larger data or other data types, will help bring more applications to the edge and help enable more stateful applications.


Ido Safruti (13:42):

The other thing I strongly agree with Pablo about is dev tools, because it's all about developers and enabling developers to build more applications: the ability to deploy easily, the ability to integrate with development environments, and the ability to debug, optimize performance, and analyze, because these are becoming a critical part of the application. That's another key thing I'm looking forward to seeing.


Ido Safruti (14:09):

One last thing, and it's more from my angle within this panel, as a vendor partnering with Fastly and offering services on the edge to customers that are using Fastly: building and enhancing this environment to be more vendor-friendly, or more vendor-safe. For instance, the ability for us as a vendor to have access to data on the edge that is limited to us, without giving us a key to the customer's entire application or to the entire data store of the application, so that we're not breaching any privacy or, by mistake, potentially getting access to any service that a customer may have, and limiting the access only to the module that we want to enable for the customer.


Kimmie Nguyen (14:54):

So let's talk about the future. Based on your personal experience and the perspective you bring from your industries, where do you see the serverless market heading in the next five years?


Phil Cluff (15:05):

I think, to touch on what I said earlier, interchangeability is critical, but we also deploy a lot of apps on serverless platforms that aren't just about executing little bits of code at the edge. For example, we work with platforms like Vercel to build more of our application layer; things like our marketing sites and our actual dashboards are built there. And I think getting to a place where we can deploy static sites onto platforms like this becomes really, really interesting, and, as the ecosystem grows, getting to a place where you can put together more componentized offerings, right? Things from different vendors that you can plug together in a more plug-and-play manner.


Pablo Mercado (15:53):

I see there being a transition from the building blocks that we have now, which are really exciting, to complete product and service offerings. I see the emergence of an ecosystem of serverless and Compute@Edge products and services. Customers will be able to enable or pay for at-edge third-party products and services that may be built on the Fastly platform and available through a marketplace. That would be really exciting. You could pick out the functionality enhancements that you want for your business.


Pablo Mercado (16:33):

And perhaps in the future, solutions will be portable across serverless platforms. I think that would be really exciting as well. So we're going to see complete solutions. We're going to see a marketplace, an ecosystem being built out rather than just building blocks. And we're going to see the edge being used to build out these solutions rather than just enhancing underlying apps. For example, I think the WordPress of the future will probably be built at the edge.


Ido Safruti (17:08):

I would like to differentiate between serverless in general and serverless on the edge, or Compute@Edge. We've already seen a big transition over the last couple of years to serverless applications in data centers, or within the cloud vendors, where they do have access to data and to data stores because they're already running there. And I think the ability of the edge to add these capabilities and make it more and more applicable will help transition these kinds of payloads and applications to the edge. And I think it will be a very exciting moment when these two motions converge and, from the developer's perspective, there won't be any difference between serverless in a data center and serverless at the edge. I think that will present great opportunities.


Ido Safruti (17:58):

And the other thing, to continue on what Pablo was saying, is the ability for vendors and for other open source modules and libraries to be available, so that developers can stitch modules together and build applications faster, either by integrating more components or by leveraging third-party vendors on the edge. That simplifies the application and helps developers basically build whatever logic is unique to them, rather than implementing a lot of logic that is common to many other services.


Ido Safruti (18:39):

I see that as a similar transition to what we've seen on the front end, as more and more code runs on the front end, where front-end developers can leverage a lot of vendors to integrate capabilities in the browser and utilize open source libraries, so that they can write less code, be more efficient, and move faster.


Kimmie Nguyen (19:01):

I wish we could keep going, but it looks like we're out of time. Thank you, Pablo, Phil, and Ido, for sharing your thoughts today. Some of the things that you spoke about, like performance, agility, extension of VCL, and complex logic at the edge, are core tenets of what we're building, and I'm really glad that you highlighted them. I really enjoyed the conversation and hope you enjoy the rest of your day.

