WEBVTT

00:00:12.680 --> 00:00:16.620
Scott: Welcome to the Bikeshed Podcast, where we talk about all things software engineering,

00:00:17.250 --> 00:00:23.320
Scott: current events, and the game of life. I'm your co-host, UI simp, Alacritus Pollyanna,

00:00:23.350 --> 00:00:27.860
Scott: and Coke Zero addict, Scott Kaye. And alongside me is my co-host,

00:00:28.520 --> 00:00:35.280
Scott: Web Framework connoisseur, the most feared PR reviewer, and GitHub Copilot's own copilot, Matt Hamlin.

00:00:36.960 --> 00:00:38.840
Scott: Matt, what's our agenda today?

00:00:40.100 --> 00:00:41.280
Matt: What a wonderful intro.

00:00:42.700 --> 00:00:53.220
Matt: Yeah, I think today we wanted to dig into enshittification and some topics that are maybe related to that, as it relates to web development, but maybe also more broadly.

00:00:55.340 --> 00:01:01.380
Matt: For those that don't know, enshittification has been a fairly popular term over the recent maybe five years, maybe longer.

00:01:02.910 --> 00:01:08.520
Matt: Just, sort of, things are getting worse is maybe the general consensus.

00:01:10.400 --> 00:01:18.720
Matt: It's like the quality of products that we buy is deteriorating or you pay more, but you get less.

00:01:20.030 --> 00:01:21.080
Matt: So those kinds of concepts.

00:01:21.300 --> 00:01:25.940
Matt: But then I think also looking at it from the lens of web development or just development in general.

00:01:27.800 --> 00:01:30.320
Matt: Yeah, I think that's kind of like a rough concept.

00:01:30.580 --> 00:01:38.060
Matt: I think with that, we also want to kind of talk a little bit about this blog post from Grant Slatton called Nobody Cares,

00:01:38.140 --> 00:01:42.040
Matt: which is kind of interesting and maybe sort of related.

00:01:43.400 --> 00:01:50.460
Matt: And then also maybe dig into a term I think one of us coined, the "shadification" of the web.

00:01:51.320 --> 00:01:53.660
Matt: So when we get to that, we can dig into that a little bit more.

00:01:55.600 --> 00:01:57.460
Matt: Yeah, Scott, like, what's your take on

00:01:57.640 --> 00:01:58.760
Matt: enshittification? Like, how do you see it?

00:02:02.020 --> 00:02:03.960
Scott: I think you summed it up quite well, to be honest.

00:02:04.900 --> 00:02:07.740
Scott: I kind of read it and did a little research about

00:02:08.110 --> 00:02:10.020
Scott: how people were using it and it was very much

00:02:11.020 --> 00:02:12.700
Scott: kind of how you're describing it is

00:02:13.630 --> 00:02:17.060
Scott: we have these software products out there that were once great

00:02:17.680 --> 00:02:20.480
Scott: and now they're just currently

00:02:20.500 --> 00:02:26.040
Scott: trending downward. They're not really adding new value. I can name some examples of

00:02:26.680 --> 00:02:31.580
Scott: things that I've noticed recently. Like, I listen to some podcasts on Spotify, and now I get

00:02:31.740 --> 00:02:38.860
Scott: commercials. So the price just goes up and you're not really getting much new. I will say

00:02:39.000 --> 00:02:45.640
Scott: Spotify has had some good new stuff, like being able to do jam sessions and whatnot. They're

00:02:45.660 --> 00:02:50.460
Scott: still creating features. But an example of, like, you're paying money and now you're getting ads,

00:02:51.060 --> 00:02:56.940
Scott: that seems to be counterintuitive. Or Netflix is a good example, where, you know, you used to just

00:02:56.940 --> 00:03:04.560
Scott: pay $7.99. Now you pay $25 for the premium, if I'm correct. Otherwise you don't

00:03:04.640 --> 00:03:14.260
Scott: get HD 4K, or you get commercials. So they're starting to just say, oh, how can we make more money

00:03:14.780 --> 00:03:18.360
Scott: and not give more, but rather make people pay for the top tier,

00:03:18.730 --> 00:03:20.020
Scott: which is smart marketing,

00:03:20.640 --> 00:03:26.020
Scott: but it's not innovating on what the product was prior.

00:03:27.300 --> 00:03:30.240
Scott: And that's not to say that those are actually two companies

00:03:30.620 --> 00:03:34.180
Scott: that I think are good in the tech space

00:03:34.310 --> 00:03:35.320
Scott: and still do good things.

00:03:35.920 --> 00:03:38.580
Scott: But there is just this sense that the product's been around for a while.

00:03:39.380 --> 00:03:42.120
Scott: And instead of trying to see where we can innovate

00:03:42.120 --> 00:03:49.500
Scott: and add new value, it's been very much, how can we monetize what we have? And the product is kind

00:03:49.500 --> 00:03:55.320
Scott: of going south. I guess a better example would have been Facebook, something that people loved

00:03:55.620 --> 00:04:01.920
Scott: maybe a decade and a half ago. When it first came out, it was very exclusive. And even when

00:04:02.260 --> 00:04:08.280
Scott: Instagram came out, what, I don't know, 10 years ago, when I was in college, people

00:04:08.300 --> 00:04:13.760
Scott: flocked to that, and then older folks used Facebook. And maybe some people like Facebook, but

00:04:14.840 --> 00:04:19.220
Scott: a hot take: I feel like software engineers move further and further away from

00:04:19.820 --> 00:04:27.320
Scott: popular social media. I just always feel that's been the case. What about you, Matt?

00:04:19.820 --> 00:04:27.320
Matt: Yeah, like, I mean,

00:04:27.780 --> 00:04:31.260
Matt: on your last point, I don't think it's limited to software engineers. I think it's maybe just

00:04:31.520 --> 00:04:37.380
Matt: people our age, probably, sure, which is maybe dating us or whatever.

00:04:38.620 --> 00:04:42.520
Matt: But yeah, I mean, you're talking about some really good examples.

00:04:42.670 --> 00:04:50.920
Matt: I think other trends that come to mind are, like, you know, every product out there now has to include AI as part of it.

00:04:50.950 --> 00:05:04.720
Matt: And, you know, it's like, OK, there are some cool AI integrations, but there are places where they're just tacking it on just to say that they're an AI company or, you know, offer AI features or whatever.

00:05:05.040 --> 00:05:08.480
Matt: but I think sometimes those just become more annoying.

00:05:08.930 --> 00:05:10.460
Matt: There's, I think, a really good example.

00:05:10.660 --> 00:05:12.260
Matt: This is even predating AI,

00:05:12.310 --> 00:05:16.620
Matt: but Comcast's sort of customer service, right?

00:05:16.760 --> 00:05:20.800
Matt: It's intentionally designed to not help you.

00:05:21.420 --> 00:05:23.720
Matt: And now that they have an AI chatbot

00:05:23.720 --> 00:05:26.920
Matt: in front of the actual person answering the call,

00:05:26.980 --> 00:05:31.980
Matt: it's like, you know, now it's even more so designed for that.

00:05:33.560 --> 00:05:34.100
Matt: So I don't know.

00:05:34.260 --> 00:05:45.580
Matt: I think there's a lot of examples of that, where it's just like all these companies are maybe building features that don't really serve the need of the original product.

00:05:45.760 --> 00:05:55.540
Matt: I think that's maybe a broader topic: companies eventually grow to a scale, or the product grows to a point, where it's no longer serving the original use case.

00:05:56.220 --> 00:05:57.640
Matt: Sometimes that works out, sometimes it doesn't.

00:05:58.500 --> 00:06:06.340
Matt: But yeah, I think just in recent times, it feels like the quality of everything has dropped off a cliff.

00:06:08.380 --> 00:06:08.900
Matt: Yeah, I don't know.

00:06:09.020 --> 00:06:11.580
Matt: But, like, hindsight is probably playing a big role there, though.

00:06:11.700 --> 00:06:18.560
Matt: It's like, you know, using Microsoft Word 20 years ago or something probably also had a lot of pain points and a lot of issues.

00:06:18.820 --> 00:06:20.380
Matt: But using it today is probably even worse.

00:06:20.510 --> 00:06:20.900
Matt: I don't know.

00:06:22.100 --> 00:06:28.360
Scott: You bringing up AI is really great, because that is definitely the current conundrum, I guess.

00:06:29.000 --> 00:06:43.140
Scott: But you made me think of how recently I was looking at product reviews on Amazon, and they have AI to summarize everything in the reviews, to tell you maybe what's included with the product, to try to find an answer.

00:06:43.880 --> 00:06:46.740
Scott: And I feel like, yeah, that's kind of helping the user.

00:06:46.930 --> 00:06:53.800
Scott: But that's instead of just making a better UI (and we're going to get into this) or making it easier to discover product information.

00:06:54.940 --> 00:06:56.000
Scott: Amazon is a great example.

00:06:56.300 --> 00:07:06.720
Scott: I've wanted to talk about this forever: a lot of product sites, and we've built product websites ourselves.

00:07:06.930 --> 00:07:13.420
Scott: They don't really care about selling the actual product; they just want to get as many products up

00:07:13.500 --> 00:07:19.120
Scott: there as they can. And the product details, at one point it was more like, we need to get the

00:07:19.380 --> 00:07:26.220
Scott: details up front, but now they seem very secondary. It's like, how many products can we throw on a page

00:07:26.840 --> 00:07:32.500
Scott: to try to just sell something? So it's, again, all about monetization and less about

00:07:32.820 --> 00:07:37.660
Scott: the customer finding what they need. But, I mean, I use this example of Amazon AI. It's like,

00:07:37.920 --> 00:07:43.460
Scott: yeah, it helps me find what I need, but instead of just making the page work for me,

00:07:43.480 --> 00:07:53.240
Scott: they throw AI in there to summarize it, which probably takes more effort and more page load, or request

00:07:53.520 --> 00:07:56.440
Scott: time rather, just to get it to load for me.

00:07:56.460 --> 00:07:58.880
Scott: So that's one thing that you made me think of.

00:07:59.040 --> 00:08:05.920
Scott: And then the other thing you talked about: products growing to where the original use case might

00:08:05.920 --> 00:08:06.260
Scott: be abandoned.

00:08:06.740 --> 00:08:12.400
Scott: Obviously, sometimes the products evolve over time and that mission statement changes.

00:08:13.320 --> 00:08:17.400
Scott: But this is something, you know, I like to harp on is like, what is our goal?

00:08:17.400 --> 00:08:18.020
Scott: What is our mission?

00:08:18.340 --> 00:08:25.340
Scott: I think when we were at Wayfair early on, that always really hit home:

00:08:25.540 --> 00:08:28.880
Scott: We were trying to make it easy for people to find everything for their house.

00:08:30.500 --> 00:08:33.240
Scott: And I think that really does drive decision making downward.

00:08:33.479 --> 00:08:39.120
Scott: But we saw firsthand as products grow, people come and go.

00:08:39.360 --> 00:08:40.099
Scott: Missions change.

00:08:40.280 --> 00:08:41.500
Scott: People aren't always aligned.

00:08:41.979 --> 00:08:47.300
Scott: And everyone maybe has a different understanding of what they're trying to build.

00:08:47.400 --> 00:08:55.240
Scott: So just having that alignment to make sure that people are trying to build towards the same end goal or to see that the space that you're in might change.

00:08:56.820 --> 00:08:58.980
Scott: I'm trying to think of a product, though maybe not with a specific example.

00:08:59.080 --> 00:09:05.260
Scott: You might build something for a specific use case, but realize, hey, there's this use case over here and we need to pivot towards that.

00:09:07.399 --> 00:09:15.740
Scott: So inherently, I feel like enshittification is partially when they lose sight of a goal.

00:09:15.910 --> 00:09:17.980
Scott: And it's more of just like, how can we make money?

00:09:19.240 --> 00:09:21.920
Scott: Maybe it's not always a corporation thing, but we're really big.

00:09:22.290 --> 00:09:29.760
Scott: And we're kind of just throwing things at the wall to see what sticks for monetization value over customer satisfaction value.

00:09:31.300 --> 00:09:34.440
Matt: Yeah, I've been noodling on this topic.

00:09:35.400 --> 00:09:48.480
Matt: I think we've worked at companies where there's that sort of common phrase of, as soon as a measure becomes a target, it's no longer valuable, or something like that.

00:09:48.680 --> 00:10:00.780
Matt: I'm probably getting that wrong, but I think we both worked at places where the KPIs became far more important than actually delivering value.

00:10:01.880 --> 00:10:04.780
Matt: And I think that's an interesting thread.

00:10:06.820 --> 00:10:09.860
Matt: That's probably become something that's far more important

00:10:10.480 --> 00:10:14.400
Matt: maybe in the past 20 years or something than in prior times.

00:10:16.320 --> 00:10:22.100
Matt: And we've both also worked at places that compare themselves

00:10:22.360 --> 00:10:26.500
Matt: to a lot of other competitors and see how they stack up against others

00:10:26.560 --> 00:10:27.440
Matt: on different benchmarks.

00:10:28.820 --> 00:10:30.780
Matt: I've had this blog post stewing in my mind

00:10:30.820 --> 00:10:37.320
Matt: of, like, you know, working to get better at a benchmark does not make you a better

00:10:37.900 --> 00:10:43.260
Matt: product company. I think you're just focusing on the wrong problem. You're sort

00:10:43.300 --> 00:10:47.800
Matt: of focusing on how you rank versus others, versus how you actually solve the customer's

00:10:47.900 --> 00:10:53.320
Matt: needs. And so, yeah, I wonder if that's, you know, kind of like what you're saying. I

00:10:53.380 --> 00:11:00.119
Matt: wonder if this enshittification is inherent in teams that just grow to a size where they start going

00:11:00.140 --> 00:11:02.860
Matt: after KPIs rather than solving customer needs.

00:11:03.800 --> 00:11:05.920
Matt: I think a lot of maybe CEOs or whatever

00:11:06.120 --> 00:11:07.120
Matt: sort of convince themselves like,

00:11:07.150 --> 00:11:11.800
Matt: oh yeah, the KPI is a way to go after the customer need. But

00:11:11.860 --> 00:11:14.100
Matt: I think that easily slips.

00:11:14.150 --> 00:11:15.400
Matt: It becomes a proxy metric

00:11:15.600 --> 00:11:17.300
Matt: and then that proxy gets further and further away

00:11:17.480 --> 00:11:18.480
Matt: from what they're actually trying to solve.

00:11:20.500 --> 00:11:20.900
Matt: Yeah, I don't know.

00:11:21.260 --> 00:11:25.360
Matt: It's interesting that there's this, I don't know,

00:11:27.120 --> 00:11:28.340
Matt: companies that start small

00:11:28.370 --> 00:11:29.840
Matt: and sort of really solve product needs

00:11:30.120 --> 00:11:31.140
Matt: eventually grow to that size.

00:11:31.140 --> 00:11:31.780
Matt: I don't know if there's like,

00:11:32.240 --> 00:11:32.940
Matt: there's probably something else

00:11:33.140 --> 00:11:35.000
Matt: that sort of contributes to that.

00:11:37.360 --> 00:11:39.500
Scott: Yeah, I also think when like folks are aligned

00:11:40.660 --> 00:11:43.980
Scott: and they were like constantly building new products

00:11:44.260 --> 00:11:45.780
Scott: or just trying to build something small,

00:11:46.040 --> 00:11:47.880
Scott: get it out there, get some testing

00:11:48.360 --> 00:11:49.500
Scott: and then expand on that,

00:11:50.160 --> 00:11:51.460
Scott: we were always more successful

00:11:51.640 --> 00:11:52.440
Scott: because we were like,

00:11:53.420 --> 00:11:54.680
Scott: I guess for lack of a better term,

00:11:54.900 --> 00:11:56.780
Scott: sussing out what the user really wants.

00:11:57.200 --> 00:11:58.320
Scott: Because sometimes it's unclear

00:11:58.580 --> 00:11:59.560
Scott: and especially in a new space,

00:11:59.780 --> 00:12:01.020
Scott: Like, AI is a good example.

00:12:01.170 --> 00:12:02.960
Scott: Like, everyone's just building a chatbot,

00:12:03.290 --> 00:12:04.480
Scott: but is that really what a user needs?

00:12:04.920 --> 00:12:06.420
Scott: Like, maybe, yeah, sure.

00:12:07.900 --> 00:12:10.620
Scott: But there are probably more applicable use cases that,

00:12:11.160 --> 00:12:12.800
Scott: I mean, I had the Amazon one.

00:12:13.030 --> 00:12:14.520
Scott: I don't know if that's, like, the best example,

00:12:14.610 --> 00:12:15.600
Scott: but it is helpful.

00:12:16.860 --> 00:12:18.320
Scott: Maybe clean up the rest of your UI.

00:12:19.240 --> 00:12:20.480
Scott: But you also made me think of, like,

00:12:20.530 --> 00:12:23.780
Scott: when companies grow, you'll see,

00:12:24.060 --> 00:12:25.660
Scott: and people come and go,

00:12:26.220 --> 00:12:28.120
Scott: you'll see the same tests run over and over.

00:12:28.140 --> 00:12:36.740
Scott: We did, I'll never forget, I don't know how many header tests that tested different navigations

00:12:37.520 --> 00:12:47.480
Scott: in my tenure at W, but it's more than 20, and we always just got the same data. And

00:12:47.840 --> 00:12:51.640
Scott: we did record data in places, but clearly it wasn't

00:12:53.800 --> 00:12:59.920
Scott: enough of a necessity to record it and make it available and easy to find, so that we could

00:13:00.580 --> 00:13:06.360
Scott: learn from it instead of doing it again and again and again, with this concept that, oh, maybe the

00:13:06.440 --> 00:13:13.860
Scott: way users shop for things changes. Sure, I do think there is some of that, but that kind of

00:13:13.870 --> 00:13:20.980
Scott: gets more towards what we'll get to as our next topic, about how a lot of web apps are starting

00:13:21.000 --> 00:13:27.420
Scott: to use similar pieces or the same pieces. But we were never really creating anything net new that

00:13:27.620 --> 00:13:32.480
Scott: hadn't been done before. It was just how we were going to interpret, for the user, the easiest path

00:13:32.690 --> 00:13:41.080
Scott: forward. And I wanted to just kind of go down this lane of, often, I think, as companies grow, or

00:13:41.080 --> 00:13:45.860
Scott: at least what I have seen is, they start prioritizing feature, feature, feature, feature.

00:13:46.300 --> 00:14:01.540
Scott: They're not thinking enough about the value that feature builds, or weighing the value and pros and cons of different features, but rather appeasing the stakeholder: these features are being delivered and they might have value.

00:14:02.740 --> 00:14:10.180
Scott: For years, we didn't have one very specific feature, and that was comparisons.

00:14:10.300 --> 00:14:15.220
Scott: Like, a lot of the sites could compare products, and we never had that. But we also would rebuild

00:14:15.360 --> 00:14:19.040
Scott: these features. Matt, I know you want to say something. I have another thought after this, but go ahead.

00:14:19.520 --> 00:14:24.160
Matt: Yeah, yeah. I guess, on the feature building, I think it's less

00:14:25.300 --> 00:14:28.960
Matt: about releasing a lot of features. I think it's more releasing a lot of optimizations.

00:14:29.590 --> 00:14:34.120
Matt: I think internally in those big companies they sell those optimizations as

00:14:34.140 --> 00:14:40.980
Matt: features, but for the end user of the product, right, it's rarely actually a

00:14:41.160 --> 00:14:44.920
Matt: feature. Versus, like you were saying, in terms of the A/B testing sort of flow, it's

00:14:45.000 --> 00:14:48.800
Matt: like, you know, with all those different tests for the header, they weren't

00:14:49.439 --> 00:14:54.140
Matt: necessarily always releasing new features for the header. They were just reshaping what the

00:14:54.140 --> 00:14:57.880
Matt: header looks like, or changing some of the functionality of the header, but not actually,

00:14:58.020 --> 00:15:03.420
Matt: you know, at that time at Wayfair, we weren't introducing brand new

00:15:03.460 --> 00:15:05.420
Matt: paradigms on how headers

00:15:05.690 --> 00:15:06.420
Matt: work on websites.

00:15:07.900 --> 00:15:09.020
Matt: Yeah, I have a further point for that.

00:15:09.860 --> 00:15:10.720
Matt: I'll let you get back to it.

00:15:11.540 --> 00:15:13.060
Scott: I absolutely agree with that.

00:15:13.790 --> 00:15:15.760
Scott: But I guess more so often...

00:15:16.720 --> 00:15:17.640
Scott: When I would

00:15:17.740 --> 00:15:18.780
Scott: do it, I can only speak for myself,

00:15:19.460 --> 00:15:21.380
Scott: we would eradicate the old code

00:15:21.690 --> 00:15:23.400
Scott: and write net new code.

00:15:23.880 --> 00:15:25.100
Scott: It is definitely true that

00:15:25.540 --> 00:15:27.380
Scott: as we grew, using

00:15:27.450 --> 00:15:29.300
Scott: the header again as an example, it seemed like teams

00:15:29.330 --> 00:15:31.040
Scott: just moved a thing around.

00:15:31.440 --> 00:15:33.400
Scott: They were not really...

00:15:34.020 --> 00:15:35.380
Scott: starting and rebuilding a thing.

00:15:36.070 --> 00:15:39.080
Scott: So the way I see it as it being a new feature

00:15:39.320 --> 00:15:40.760
Scott: is like we just built some new code

00:15:40.940 --> 00:15:42.080
Scott: and replaced some old code,

00:15:42.280 --> 00:15:44.840
Scott: but that may not be how everyone does it, but continue.

00:15:46.180 --> 00:15:49.700
Matt: No, yeah, I think internally, right,

00:15:49.880 --> 00:15:51.940
Matt: like you can look at it as like a new feature,

00:15:53.200 --> 00:15:54.240
Matt: it's like new, right?

00:15:54.360 --> 00:15:55.700
Matt: But I think if you look at it

00:15:55.710 --> 00:15:57.420
Matt: from the perspective of like a customer, right,

00:15:57.560 --> 00:15:59.340
Matt: like the visitor to the website,

00:15:59.530 --> 00:16:02.580
Matt: like they don't care if, you know, for example,

00:16:02.860 --> 00:16:04.560
Matt: Like migrating tech stacks, right?

00:16:04.740 --> 00:16:06.420
Matt: Like they don't care if the previous tech stack was,

00:16:07.220 --> 00:16:10.080
Matt: you know, or like if the team behind the product

00:16:10.250 --> 00:16:11.680
Matt: like swapped out the tech stack, right?

00:16:11.860 --> 00:16:14.880
Matt: Like the features generally are on parity

00:16:14.970 --> 00:16:16.920
Matt: with what you're used to before.

00:16:18.300 --> 00:16:20.760
Matt: I think more broadly, it's like making me think of like,

00:16:21.160 --> 00:16:21.880
Matt: you know, companies become,

00:16:23.340 --> 00:16:24.620
Matt: and this like phrase sort of came to me

00:16:24.630 --> 00:16:25.620
Matt: as you were talking about like companies

00:16:25.760 --> 00:16:27.260
Matt: sort of become optimization engines

00:16:27.640 --> 00:16:28.880
Matt: over innovation engines, right?

00:16:28.960 --> 00:16:30.980
Matt: Like I think when you're a small company,

00:16:31.280 --> 00:16:32.380
Matt: you focus on innovation, right?

00:16:32.460 --> 00:16:43.600
Matt: You focus on how can you solve a problem in a new way compared to people already in the market or people that have tried solving that problem in the past.

00:16:44.200 --> 00:16:50.920
Matt: And then when you're a big company, you start to focus more on these A/B tests, like these slight tweaks to different things.

00:16:51.280 --> 00:16:53.960
Matt: We've probably all heard that popular article.

00:16:54.020 --> 00:16:58.620
Matt: I don't know if it's true or not, but, like, Google A/B tests the color blue on their links.

00:16:59.640 --> 00:17:06.500
Matt: So, yeah, and, you know, it's like, oh, we changed the hue of the blue and we get 10 percent more conversion on people clicking through to ads or whatever it is.

00:17:08.300 --> 00:17:11.280
Matt: And so, yeah, that's an optimization, not an innovation.

00:17:11.589 --> 00:17:23.600
Matt: Right. So I wonder if it's just that so many people are used to that, or maybe those optimizations are seen as just not making the product better.

00:17:23.880 --> 00:17:25.800
Matt: Like they're not innovating in the space that people expect.

00:17:26.120 --> 00:17:32.700
Matt: You know, it's a completely different scale, but it's like people asking where the flying cars are, right?

00:17:32.840 --> 00:17:36.700
Matt: Like, you know, we're not seeing that kind of growth happen.

00:17:38.500 --> 00:17:38.580
Matt: Yeah.

00:17:39.580 --> 00:17:41.620
Scott: So you're actually, you're changing my perspective a little.

00:17:41.680 --> 00:17:45.940
Scott: I do think there were like some, I guess, net new features that they were trying to build.

00:17:46.140 --> 00:17:54.260
Scott: But you're actually now making me think that we were at our strongest when we were trying to build new things.

00:17:54.660 --> 00:17:57.980
Scott: Like, that's like a new product as part of the product.

00:17:58.240 --> 00:17:59.720
Scott: It could be like a feature within the product.

00:18:00.420 --> 00:18:02.120
Scott: And we were trying to iterate on that.

00:18:02.660 --> 00:18:04.960
Scott: And that was probably when we were at our best.

00:18:05.420 --> 00:18:08.660
Scott: But what I wanted to bring up is what you're talking about, like optimizations.

00:18:10.260 --> 00:18:13.780
Scott: They were optimizations, but at the cost of not looking at performance.

00:18:16.500 --> 00:18:21.780
Scott: I think maybe other companies did it differently, but we wouldn't prioritize performance enough.

00:18:21.980 --> 00:18:50.440
Scott: It would constantly be brought up that we needed some sort of performance metric to work against that teams would adhere to. People believed performance was important, but it wasn't, I guess, a first-class citizen in the work people did, and people weren't well taught how to make sure they delivered on it; rather, it was very much a secondary thought.

00:18:51.980 --> 00:18:57.620
Scott: My original point was, before you deliver something net new, and if you're a small place,

00:18:57.720 --> 00:19:04.020
Scott: performance can be a little easier. You still have to care about it, but you might be more

00:19:04.160 --> 00:19:08.020
Scott: likely to have a smaller app and therefore be able to get more out of it. It's not always the case,

00:19:08.500 --> 00:19:14.460
Scott: I've definitely had complications growing apps at smaller places. But you're able to,

00:19:14.960 --> 00:19:19.699
Scott: like, there's no one else there when something isn't performing in a small company, and you're on the

00:19:19.720 --> 00:19:27.280
Scott: hook. At a big company, your goal is more to appease the stakeholder so that they like what you do, and

00:19:27.680 --> 00:19:32.560
Scott: it's easier to, this is maybe a strong term, but scapegoat another team that has to solve

00:19:33.190 --> 00:19:41.220
Scott: problems. Maybe this is more of a big-company problem, but it's easier to

00:19:42.460 --> 00:19:44.080
Scott: act as if you have less

00:19:45.520 --> 00:19:46.160
Scott: ownership

00:19:46.280 --> 00:19:48.060
Scott: of a thing when

00:19:49.220 --> 00:19:50.120
Scott: you're described

00:19:50.160 --> 00:19:51.860
Scott: as owning a feature

00:19:52.520 --> 00:19:54.200
Scott: and you view it

00:19:54.660 --> 00:19:55.400
Scott: from a perspective

00:19:56.240 --> 00:19:57.640
Scott: that like I just own this section

00:19:58.799 --> 00:19:59.840
Scott: and especially

00:20:00.040 --> 00:20:00.220
Scott: like

00:20:01.440 --> 00:20:03.860
Scott: well this is maybe more specific

00:20:03.860 --> 00:20:05.860
Scott: to the background of like some folks

00:20:05.900 --> 00:20:08.200
Scott: at our company at the time

00:20:08.820 --> 00:20:10.079
Scott: just owned visuals

00:20:10.740 --> 00:20:13.900
Scott: and they never really cared about APIs and endpoints.

00:20:16.399 --> 00:20:18.980
Scott: And maybe it wasn't taught well enough that,

00:20:19.080 --> 00:20:20.720
Scott: hey, that is part of the front end these days.

00:20:21.500 --> 00:20:24.780
Scott: But understanding how,

00:20:24.850 --> 00:20:28.040
Scott: well, also having service teams that provide these for you,

00:20:28.590 --> 00:20:31.040
Scott: that blurs the lines a little bit more.

00:20:31.430 --> 00:20:36.300
Scott: But essentially, it's harder to draw ownership lines

00:20:36.500 --> 00:20:39.059
Scott: when there are more people owning things

00:20:39.220 --> 00:20:43.740
Scott: and there aren't clearly defined guides. But when you're in a smaller place, you don't need clearly

00:20:43.940 --> 00:20:50.820
Scott: defined guides; everyone has a stake in care and ownership. So, the purpose of this, getting

00:20:50.830 --> 00:20:58.920
Scott: back to enshittification, is sometimes it can happen because lines become blurred and people's

00:20:59.170 --> 00:21:06.219
Scott: motives are tied to a specific problem that they've been told is their ownership area, or

00:21:06.720 --> 00:21:12.440
Scott: it's just all they think their ownership area is, without clear guidelines about working together

00:21:12.510 --> 00:21:18.720
Scott: as a cohesive unit and making the whole product better. I mean, people talk about 10x engineers all

00:21:18.800 --> 00:21:24.220
Scott: the time, but really all it means is that you want to learn and understand the whole sequence

00:21:24.760 --> 00:21:31.120
Scott: of how the applications and the services work together. And that's what makes you a strong

00:21:31.360 --> 00:21:36.200
Scott: engineer. Like, I'll toot your horn a little bit, Matt, but one of the things that made you so strong

00:21:36.220 --> 00:21:43.260
Scott: at W is you understood, better than most, how all the services worked and came together to make the

00:21:43.380 --> 00:21:49.440
Scott: application work. And that is like a superpower, and I've learned from that that you need to just

00:21:49.500 --> 00:21:56.260
Scott: make sure you understand the architecture, first and foremost.

00:21:49.500 --> 00:21:56.260
Matt: Yeah, I mean, yeah.

00:21:56.260 --> 00:22:01.280
Matt: I think a lot of that stuff rings true, but it's also, like, you know, people

00:22:01.420 --> 00:22:06.180
Matt: listening might not understand the structure where they're at internally. I think really

00:22:06.560 --> 00:22:11.140
Matt: the core emphasis is: at a larger company, your area of focus is,

00:22:11.780 --> 00:22:16.900
Matt: well, pretty small, generally speaking. You know, okay, as you get higher up in the

00:22:17.000 --> 00:22:21.340
Matt: career ladder at that company, yes, you're going to have

00:22:21.340 --> 00:22:28.920
Matt: a broader perspective. But generally speaking, most ICs are just very focused on one specific

00:22:29.040 --> 00:22:33.420
Matt: feature, one specific product area. And so, yeah, exactly like what you were saying, that

00:22:34.040 --> 00:22:40.840
Matt: leads to, you know, sometimes people just not really caring about

00:22:41.070 --> 00:22:44.920
Matt: how this feature fits into the rest of the product, or, you know,

00:22:44.920 --> 00:22:50.240
Matt: what customer pain point it's solving compared to the other aspects of the product

00:22:50.420 --> 00:22:57.480
Matt: that the customer is using. Which I think kind of leads into maybe our next sort

00:22:57.480 --> 00:23:01.700
Matt: of little section, on this blog post titled Nobody Cares.

00:23:02.840 --> 00:23:05.560
Matt: I think, I mean, maybe this is not the way that

00:23:05.750 --> 00:23:07.520
Matt: the blog post was originally meant to be taken,

00:23:07.700 --> 00:23:09.940
Matt: but I think,

00:23:10.180 --> 00:23:13.460
Matt: I've seen this sort of ring true,

00:23:13.640 --> 00:23:18.100
Matt: where engineers aren't empowered to take a more holistic point of view on the

00:23:18.260 --> 00:23:21.820
Matt: entire product. Right. It's like you're,

00:23:22.140 --> 00:23:24.600
Matt: if you're told that you can only work on, you know, feature a,

00:23:25.260 --> 00:23:29.200
Matt: you might not feel like you have the ability, the agency, you know,

00:23:29.400 --> 00:23:31.460
Matt: whatever the leverage to start, you know,

00:23:31.540 --> 00:23:35.380
Matt: sort of recommending changes or applying changes to feature B, C, and D.

00:23:35.760 --> 00:23:35.820
Matt: Right.

00:23:35.940 --> 00:23:39.000
Matt: So, in many ways,

00:23:39.000 --> 00:23:43.380
Matt: it's like, whether it's that the actual engineer doesn't care, or

00:23:43.660 --> 00:23:46.800
Matt: they're just sort of forced into a position where they can't,

00:23:48.280 --> 00:23:50.980
Matt: can't apply the energy to care for those other things.

00:23:51.540 --> 00:23:54.120
Matt: I think that's, well,

00:23:54.120 --> 00:23:57.400
Matt: I think it's just an interesting sort of perspective on that blog post.

00:23:59.280 --> 00:24:01.100
Matt: Yeah. I don't know. It's a, yeah.

00:24:02.200 --> 00:24:10.680
Scott: Yeah, and I think it's a scaling factor, right? Or it's also a strategy of, how do we create...

00:24:11.420 --> 00:24:17.700
Scott: Like, we talked about pods. That was the big thing, and it never really changed.

00:24:17.840 --> 00:24:24.560
Scott: But it was like, okay, people own these specific areas. I would have argued, where we were, there

00:24:24.560 --> 00:24:29.320
Scott: could have been larger pools of ownership that crossed over, which would help with the blurring

00:24:29.340 --> 00:24:37.560
Scott: of these lines. But I think the bigger point is, when you can't define from the top

00:24:39.770 --> 00:24:47.440
Scott: clear responsibilities for the ownership of your job, and be held accountable. So I'm thinking

00:24:47.560 --> 00:24:53.200
Scott: kind of the RACI principles: if you can't define who's responsible or accountable for

00:24:53.340 --> 00:25:01.660
Scott: something like an endpoint's speed, or you don't have mechanisms to make it clear who owns what

00:25:02.440 --> 00:25:07.880
Scott: because the size of your org is really big, you start to have these situations where

00:25:08.360 --> 00:25:14.220
Scott: you have a problem, you know about it, but you don't think you own it. You might, but, assuming good intent,

00:25:14.860 --> 00:25:19.780
Scott: you might not think that you own performance when there's a team that is called Performance.

00:25:19.800 --> 00:25:35.400
Scott: Right. But then again, if there isn't some clear guideline to go by to know what performance they own, or who to go to to ask about these things, or to care, you know, that's where the breakdown kind of begins, too.

00:25:35.830 --> 00:25:48.200
Scott: So it's just like really setting expectations well and defining those expectations, because I do think the way these things are broken down could be successful, but it's maybe more of a lack of education.

00:25:49.520 --> 00:26:00.000
Matt: Yeah, I would even argue that the root problem isn't that you don't know who to talk to about it.

00:26:00.280 --> 00:26:05.360
Matt: It's that that's even the mindset that you come into it with, right?

00:26:05.480 --> 00:26:22.300
Matt: Like, if you're in a culture where you're not empowered to just make change, then you are going to naturally say, okay, I need to ask someone if it's all right to make this change, or if I should be caring about this, or whatever.

00:26:22.960 --> 00:26:29.760
Matt: Versus if you're just empowered to make that change, which I think that's the case usually for most startups.

00:26:30.460 --> 00:26:37.280
Matt: The company is so small that literally no one else knows the answer to your question anyway, so you just have to do it.

00:26:38.480 --> 00:26:51.540
Matt: But in a big company, I think that usually the culture is very much like, yeah, get sign off, go through some red tape to sort of verify that you're allowed to make that change and then make the change.

00:26:53.340 --> 00:26:58.440
Matt: Yeah, I don't know. I think that's just like another sort of cultural

00:26:58.880 --> 00:27:03.700
Matt: factor that leads into why, you know, Grant in his blog post talked about nobody

00:27:03.900 --> 00:27:09.120
Matt: caring, right? They're not empowered to, at least at engineering companies, right? They're not

00:27:09.340 --> 00:27:16.440
Matt: empowered to, you know, make the change they want to see, or actually empowered to have

00:27:16.460 --> 00:27:19.540
Matt: the facility to care about the problems they're solving.

00:27:20.500 --> 00:27:25.340
Scott: Yeah, there's definitely more bureaucracy around change-making.

00:27:25.440 --> 00:27:29.540
Scott: I remember someone would propose a change and people would be like,

00:27:29.540 --> 00:27:33.500
Scott: whoa, whoa, whoa, looking at the change, really scrutinizing that change.

00:27:33.900 --> 00:27:37.460
Scott: Because when you're siloed on a small team and you all start to work together and gel,

00:27:37.500 --> 00:27:39.180
Scott: you start to have common beliefs.

00:27:39.380 --> 00:27:43.260
Scott: But getting that out there to the rest of the organization

00:27:43.500 --> 00:27:46.160
Scott: or sharing and spreading those beliefs gets more difficult,

00:27:46.260 --> 00:27:52.020
Scott: unless your team is literally given time to document patterns for everyone to use.

00:27:52.650 --> 00:27:59.420
Scott: I'm clearly taking the stance of a team that is setting up a platform, given my platform experience.

00:27:59.860 --> 00:28:03.180
Scott: But yeah, I am saying there's some lack of structure that is to blame,

00:28:03.360 --> 00:28:11.040
Scott: but I do agree with you that working together culturally,

00:28:11.820 --> 00:28:15.440
Scott: and understanding and creating a culture of how we solve problems,

00:28:15.460 --> 00:28:17.520
Scott: is also big to that.

00:28:17.560 --> 00:28:18.580
Scott: But I think that's,

00:28:19.080 --> 00:28:20.960
Scott: basically, structure.

00:28:21.920 --> 00:28:22.360
Scott: Structure,

00:28:22.580 --> 00:28:24.020
Scott: like we need to enforce structure

00:28:24.100 --> 00:28:25.820
Scott: from the top down to help inform.

00:28:27.940 --> 00:28:28.100
Matt: Yeah.

00:28:28.860 --> 00:28:29.200
Matt: Yeah, I think.

00:28:29.960 --> 00:28:30.080
Matt: Yeah.

00:28:30.580 --> 00:28:31.060
Matt: Yeah, I agree.

00:28:32.300 --> 00:28:32.400
Matt: Yeah.

00:28:32.480 --> 00:28:33.280
Matt: So I guess like,

00:28:33.680 --> 00:28:34.740
Matt: I think maybe we've gotten

00:28:34.960 --> 00:28:36.400
Matt: a little bit far away

00:28:36.640 --> 00:28:37.200
Matt: from enshittification,

00:28:37.580 --> 00:28:38.960
Matt: but I think another topic

00:28:39.220 --> 00:28:40.160
Matt: that came to mind

00:28:40.340 --> 00:28:41.900
Matt: as we were talking about this topic

00:28:42.120 --> 00:28:43.260
Matt: as like being something

00:28:43.260 --> 00:28:44.340
Matt: we wanted to talk about on the podcast

00:28:44.660 --> 00:28:44.960
Matt: was like,

00:28:46.700 --> 00:28:50.440
Matt: was sort of the, I can't think of a good word,

00:28:50.450 --> 00:28:53.620
Matt: but like sort of the explosion of websites

00:28:54.140 --> 00:28:56.740
Matt: that have started to use shadcn/ui,

00:28:56.910 --> 00:28:58.140
Matt: which I don't know if I'm pronouncing correctly,

00:28:58.230 --> 00:29:00.000
Matt: but I think that's the way you pronounce it.

00:29:01.040 --> 00:29:03.880
Matt: It's a pretty popular component library

00:29:04.080 --> 00:29:05.900
Matt: that's a little bit different from most component libraries

00:29:05.950 --> 00:29:07.040
Matt: that people have used in the past

00:29:07.210 --> 00:29:10.440
Matt: in the sense that you sort of copy the code

00:29:10.510 --> 00:29:12.240
Matt: into your own code base instead of installing it

00:29:12.420 --> 00:29:13.540
Matt: from NPM or something.

00:29:16.360 --> 00:29:21.420
Matt: But I think what we've seen over maybe the past year or two, I think it's been around at least two

00:29:21.540 --> 00:29:28.200
Matt: years, is that more and more sites just, like, they all look the

00:29:28.360 --> 00:29:32.020
Matt: same. You know, if anyone's announcing, oh, I'm building this

00:29:33.040 --> 00:29:36.920
Matt: new, you know, generally these days it's an AI product, but sort of new website,

00:29:37.720 --> 00:29:42.320
Matt: and they share a screenshot, you can 100% tell that they just pulled in

00:29:42.540 --> 00:29:45.140
Matt: shadcn and used it, like, used the defaults.

00:29:46.540 --> 00:29:50.500
Matt: And I think it's like, it's an interesting pattern that we've seen.

00:29:50.910 --> 00:29:55.940
Matt: I think like a very clear telltale sign these days is if it's like monochrome.

00:29:56.510 --> 00:30:01.300
Matt: Like shadcn has colors, but like the default for everything is like black and white.

00:30:02.640 --> 00:30:07.360
Matt: I think it's like maybe people don't realize that they were meant to customize it when they

00:30:07.530 --> 00:30:07.920
Matt: added it.

00:30:08.210 --> 00:30:09.980
Matt: I think they just pulled it in and said, that's it.

00:30:10.540 --> 00:30:12.520
Matt: But yeah, I don't know.

00:30:12.680 --> 00:30:15.120
Matt: Yeah, Scott, do you have some thoughts there?

00:30:16.760 --> 00:30:18.140
Scott: Yeah, all of that.

00:30:18.340 --> 00:30:19.200
Scott: Yes, you're right on.

00:30:20.679 --> 00:30:22.180
Scott: You'd notice a lot of...

00:30:22.430 --> 00:30:24.040
Scott: And the style for websites,

00:30:25.050 --> 00:30:26.760
Scott: there are these trends that it goes through.

00:30:26.870 --> 00:30:29.820
Scott: But now it's like I have this marketing site

00:30:30.200 --> 00:30:33.260
Scott: where I have this really large hero

00:30:33.540 --> 00:30:38.700
Scott: with subtle white-to-gray or black-to-gray gradient text on it.

00:30:39.040 --> 00:30:40.180
Scott: It's really subtle.

00:30:40.860 --> 00:30:45.660
Scott: but there's also these glowy things going on in the background. You can see it on almost

00:30:46.260 --> 00:30:52.780
Scott: every little startup site. But you are right about shadcn, and he's even said it

00:30:52.900 --> 00:30:58.600
Scott: himself, the author of it, that the goal is to give it to you this way so it could

00:30:59.040 --> 00:31:06.320
Scott: be a jumping-off point for you to create what you wanted, although that doesn't happen often.

00:31:06.340 --> 00:31:14.060
Scott: But anyway, so yeah, it's meant to be customized. But I think what it's led to, and it's

00:31:14.060 --> 00:31:21.380
Scott: not just shadcn, it's a lot of these UI systems, and I guess design systems: they start

00:31:21.520 --> 00:31:27.280
Scott: to solve things the same way. Even when we worked on design systems, we would look for third

00:31:27.480 --> 00:31:31.700
Scott: parties that would solve certain little problems, and now everyone's like, okay, I'm going to solve that

00:31:31.720 --> 00:31:38.780
Scott: problem that way. There have come to be standards. Like Radix, right? Radix is one. But,

00:31:39.440 --> 00:31:45.640
Scott: you know, handling accessible navigation or tabbing and whatnot, those kinds of

00:31:45.900 --> 00:31:51.180
Scott: standards start to creep in. Those are fine, but more so it's just like, okay, we have this

00:31:51.380 --> 00:31:55.880
Scott: concept that's a modal and now we're getting tags for it; we have this concept that's a drawer;

00:31:56.440 --> 00:32:01.680
Scott: and we start to just be like, okay, these are the pieces. And even when we were working on it,

00:32:01.900 --> 00:32:05.040
Scott: we would tell the design team, use the pieces, use the pieces.

00:32:05.340 --> 00:32:12.740
Scott: So the design team is constantly, or the design systems team is constantly being asked to fix those things.

00:32:13.000 --> 00:32:16.300
Scott: And then when they're looking for inspiration, they're just all looking at each other.

00:32:16.780 --> 00:32:19.500
Scott: We start to have all these same patterns that emerge.

00:32:20.000 --> 00:32:25.160
Scott: And like, yeah, you see like a sidebar nav or a top drop-down nav,

00:32:25.230 --> 00:32:30.060
Scott: but you don't see as much innovation or chances to like, how can we do things?

00:32:32.060 --> 00:32:33.800
Scott: how can we do things in a new light?

00:32:35.160 --> 00:32:35.880
Matt: Yeah, 100%.

00:32:35.910 --> 00:32:37.060
Matt: Yeah, I think it's like, you know,

00:32:38.140 --> 00:32:40.460
Matt: it's very similar to what we were just talking about before,

00:32:40.660 --> 00:32:41.640
Matt: where it's like these large companies

00:32:42.320 --> 00:32:44.600
Matt: start to focus on optimizations rather than innovation.

00:32:44.770 --> 00:32:46.180
Matt: And it's like, I think we've kind of seen that

00:32:46.190 --> 00:32:48.780
Matt: in the open source space with these component libraries

00:32:49.040 --> 00:32:49.820
Matt: like ShadCN, right?

00:32:50.000 --> 00:32:53.080
Matt: Like, you know, there's now probably, you know,

00:32:53.320 --> 00:32:55.960
Matt: more copies or clones of shadcn than we can count

00:32:56.140 --> 00:32:58.760
Matt: that, you know, do the same exact thing,

00:32:59.000 --> 00:33:02.300
Matt: But, you know, maybe it's just, like, maintained slightly differently.

00:33:02.800 --> 00:33:05.840
Matt: Or maybe it's for a different web framework than React or whatnot.

00:33:06.140 --> 00:33:11.220
Matt: But, yeah, I think, you know, assuming the listener is still listening,

00:33:11.700 --> 00:33:14.600
Matt: it's like you're probably wondering, like, okay, like, okay, yeah,

00:33:14.820 --> 00:33:17.340
Matt: you know, like, how are we connecting shadcn to this, like,

00:33:17.460 --> 00:33:18.300
Matt: enshittification topic?

00:33:18.840 --> 00:33:21.180
Matt: I think really the, in my mind at least, it's, like,

00:33:22.220 --> 00:33:25.020
Matt: pretty similar to, like, what we saw, like, the explosion of, like,

00:33:25.100 --> 00:33:26.340
Matt: sort of Bootstrap-based sites

00:33:28.200 --> 00:33:29.040
Matt: back in the day.

00:33:29.960 --> 00:33:30.980
Matt: Scott, you probably have more

00:33:31.840 --> 00:33:33.060
Matt: historical knowledge there than I do,

00:33:34.340 --> 00:33:35.360
Matt: since you've been doing web development

00:33:35.500 --> 00:33:37.180
Matt: longer than I have, but I feel like

00:33:37.180 --> 00:33:38.620
Matt: back in the day, it's like when Bootstrap was

00:33:39.080 --> 00:33:40.260
Matt: maybe Bootstrap 4 or something,

00:33:40.960 --> 00:33:41.700
Matt: when that came out,

00:33:43.060 --> 00:33:45.100
Matt: it's just like every website looked the same.

00:33:45.720 --> 00:33:47.260
Matt: The same could be said with Material UI.

00:33:48.080 --> 00:33:49.040
Matt: I remember Material UI

00:33:49.220 --> 00:33:50.080
Matt: being really big,

00:33:51.660 --> 00:33:52.700
Matt: and just like every site

00:33:53.040 --> 00:33:54.000
Matt: wanted to use Material UI.

00:33:54.160 --> 00:33:56.540
Matt: I mean, heck, I built a few side projects

00:33:56.640 --> 00:33:58.660
Matt: and they all used Material UI and they all looked the same.

00:34:00.020 --> 00:34:00.840
Matt: And so I think it's just like,

00:34:01.320 --> 00:34:03.680
Matt: it's interesting that there's these waves of like,

00:34:04.820 --> 00:34:06.280
Matt: I don't know, like sort of waves of,

00:34:07.060 --> 00:34:08.080
Matt: you know, everyone looking the same

00:34:08.320 --> 00:34:10.179
Matt: instead of like this, you know,

00:34:10.260 --> 00:34:12.480
Matt: it just feels like there's like a lack of creativity

00:34:13.860 --> 00:34:16.020
Matt: happening around web design, web development.

00:34:17.780 --> 00:34:19.320
Matt: And I feel like that's just like,

00:34:19.480 --> 00:34:21.780
Matt: that is like an aspect of enshittification.

00:34:21.899 --> 00:34:23.700
Matt: It's like that's the connection my brain is making.

00:34:25.420 --> 00:34:26.300
Matt: but yeah

00:34:26.530 --> 00:34:26.659
Matt: yeah

00:34:28.760 --> 00:34:30.600
Scott: Yeah, a lack of creativity is my

00:34:30.860 --> 00:34:32.379
Scott: thought, but it's almost like,

00:34:33.660 --> 00:34:34.580
Scott: I'm trying to say,

00:34:34.580 --> 00:34:36.560
Scott: how can I define this, like a prescriptive

00:34:36.899 --> 00:34:38.639
Scott: solution to solve

00:34:38.919 --> 00:34:40.700
Scott: every UI problem.

00:34:41.179 --> 00:34:42.379
Scott: So UI

00:34:42.639 --> 00:34:43.919
Scott: used to be very,

00:34:44.600 --> 00:34:46.520
Scott: for us at least, very creative: how can we solve

00:34:46.550 --> 00:34:48.159
Scott: this problem, what can it do?

00:34:48.720 --> 00:34:50.500
Scott: And now it's almost like, okay, here are the

00:34:50.600 --> 00:34:51.940
Scott: prescribed solutions

00:34:52.780 --> 00:34:55.720
Scott: within the boundaries, pick them.

00:34:56.210 --> 00:34:58.500
Scott: And not like, well, how can we try to break that mold?

00:34:58.550 --> 00:35:03.460
Scott: I remember a project real specific on the design system

00:35:03.680 --> 00:35:07.640
Scott: that I couldn't gain traction on where we had this concept

00:35:07.710 --> 00:35:13.700
Scott: of having sidebars and modals within different applications.

00:35:13.870 --> 00:35:15.860
Scott: I think I called it contextual modals.

00:35:16.100 --> 00:35:19.720
Scott: That was just the idea of having an application

00:35:19.790 --> 00:35:21.720
Scott: and an application and how could we handle that.

00:35:22.760 --> 00:35:28.820
Scott: And that was so novel, I think, to sell, because it was really just like, okay, we have an admin

00:35:29.000 --> 00:35:33.100
Scott: toolbar and we need to wrap it around a whole application, how do we do it? To solve something

00:35:33.260 --> 00:35:39.420
Scott: like that problem today, people would probably just be like, make a Chrome extension or something,

00:35:40.020 --> 00:35:45.640
Scott: like don't have it in the application at all. And maybe that's valid, but what this

00:35:45.660 --> 00:35:53.520
Scott: is getting at is that trying to break the mold of what people have normalized as prescriptive

00:35:53.740 --> 00:35:57.940
Scott: patterns to solve a problem has become harder than it once was.

00:35:59.580 --> 00:36:00.120
Matt: Yeah, for sure.

00:36:00.140 --> 00:36:05.220
Matt: I think there's this huge wave of momentum around that stuff.

00:36:05.420 --> 00:36:09.520
Matt: I mean, you can kind of see it internally at big companies also where a lot of companies

00:36:09.680 --> 00:36:12.640
Matt: invest in design systems because they...

00:36:13.400 --> 00:36:16.560
Matt: I don't know if it's that they don't value UI innovation,

00:36:18.230 --> 00:36:19.680
Matt: but I think it's just like there's this,

00:36:20.960 --> 00:36:25.040
Matt: they just say like, why rebuild the same UI over and over again?

00:36:25.230 --> 00:36:29.660
Matt: And at least they use the words "same UI,"

00:36:29.800 --> 00:36:30.940
Matt: or think that it's the same UI,

00:36:31.220 --> 00:36:33.680
Matt: but it's like, I think that,

00:36:35.580 --> 00:36:38.580
Matt: I think we've probably equally heard

00:36:38.960 --> 00:36:40.980
Matt: like a large number of questions around like,

00:36:41.440 --> 00:36:44.740
Matt: oh, if we're adopting a design system, does that mean that, you know, designers don't do new work?

00:36:44.980 --> 00:36:49.080
Matt: Or does that mean that engineers don't get to, like, explore different ideas?

00:36:50.340 --> 00:36:52.320
Matt: You know, it's like we're just, like, sort of doing cookie-cutter stuff.

00:36:52.500 --> 00:36:56.320
Matt: And I think it's like, you know, most organizations usually have an answer to that.

00:36:56.420 --> 00:36:57.960
Matt: And like, no, we're not stifling innovation.

00:36:58.180 --> 00:37:00.580
Matt: We're just, like, trying to ensure consistency.

00:37:01.780 --> 00:37:06.540
Matt: But I think it's an interesting, like, it's an interesting sort of, I don't know,

00:37:06.780 --> 00:37:11.060
Matt: momentum to like kind of combat and try to come up with something unique and

00:37:11.240 --> 00:37:14.420
Matt: innovative. Right. And it's like, now we're seeing it in, you know,

00:37:14.560 --> 00:37:16.780
Matt: not necessarily now in the open source space, but like,

00:37:17.080 --> 00:37:19.800
Matt: like it sort of happened, like we were talking about, with Bootstrap and Material,

00:37:20.020 --> 00:37:24.960
Matt: but it's just like it's more common outside of large

00:37:25.100 --> 00:37:26.380
Matt: corporations also.

00:37:28.600 --> 00:37:29.600
Scott: Yeah. I almost feel like,

00:37:29.720 --> 00:37:34.000
Scott: so I remember being on the design system and having to sell the design system

00:37:34.640 --> 00:37:40.980
Scott: And the big sell for the design system is, hey, it allows us to move faster because we have consistency.

00:37:41.300 --> 00:37:44.380
Scott: Everything looks the same, so it feels like one cohesive flow.

00:37:45.200 --> 00:37:53.800
Scott: And we have a starting point or we have pieces, and we can now just focus on the data layer so that the UI can be built quicker.

00:37:53.970 --> 00:37:56.340
Scott: And we're not rebuilding the same thing, so we're not wasting time.

00:37:56.860 --> 00:38:06.460
Scott: So it's almost like we did too good of a job as UI engineers at the time selling the value of having building blocks ready to go.

00:38:07.900 --> 00:38:15.460
Scott: But the bigger problem is just that it seems like, more and more, all design systems have the same pieces.

00:38:16.760 --> 00:38:18.300
Scott: And I try to do this all the time.

00:38:18.440 --> 00:38:27.080
Scott: I try to be like, okay, what should actually be a feature that is common enough that it's its own feature?

00:38:27.220 --> 00:38:28.920
Scott: I always talk about the Command-K palette.

00:38:29.460 --> 00:38:30.740
Scott: Sure, it's just a modal, maybe.

00:38:31.480 --> 00:38:36.580
Scott: But that itself, to me, is something that should just be a feature.

00:38:36.900 --> 00:38:40.280
Scott: You should be able to pull in a select dropdown and maybe a modal.

00:38:40.860 --> 00:38:41.380
Scott: Maybe not.

00:38:41.960 --> 00:38:47.400
Scott: But if you're building something like shadcn, where any new application might use it,

00:38:47.780 --> 00:38:52.460
Scott: I think that's almost a common enough feature that it should be created as a compound component.

00:38:52.640 --> 00:38:54.400
Scott: But these are the things that I think about.

00:38:56.160 --> 00:38:59.220
Scott: Like shadcn doesn't really care about, like they do this unique thing, right?

00:38:59.800 --> 00:39:04.660
Scott: They're like, oh, here you take the code and download it and you own it now.

00:39:05.120 --> 00:39:09.000
Scott: So they almost wipe their hands clean of having to worry about things.

00:39:09.060 --> 00:39:13.600
Scott: I remember people talking about, well, what if like Radix updates and it destroys your UI?

00:39:14.180 --> 00:39:16.300
Scott: They don't care. It doesn't bother them.

00:39:16.460 --> 00:39:19.640
Scott: I mean, it hasn't, and I don't even think it's gonna.

00:39:20.620 --> 00:39:23.500
Scott: But the point being, now that shadcn is so big too,

00:39:24.340 --> 00:39:29.380
Scott: the likelihood that they have to make sure they don't break all those sites becomes higher.

00:39:29.620 --> 00:39:34.820
Scott: But the point being, that simplification,

00:39:34.960 --> 00:39:37.960
Scott: which is almost easier than managing and owning dependencies,

00:39:39.080 --> 00:39:42.160
Scott: has just allowed them to be like, oh yeah, just copy the code.

00:39:42.420 --> 00:39:45.320
Scott: But they still get to stamp their name on it.

00:39:45.840 --> 00:39:59.860
Scott: And it's just very interesting that something as simple as, like, just copy and paste the code, basically, or, you know, run a script that downloads the code into a file for you has become, like, the norm there.

00:40:01.100 --> 00:40:21.680
Scott: And I find that to be also like part of the problem, I guess, where like if everyone just does it this way and then we get used to it, it comes back to what I was just trying to say like slightly earlier is we just like accept that there are certain ways to solve a problem.

00:40:21.780 --> 00:40:30.340
Scott: And those are the only ways. And it just digs this hole deeper that it's harder for us to decide, let's try this new thing or sell this new thing.

00:40:30.810 --> 00:40:38.400
Scott: When people will point to and look at shadcn, or, I mean, Chakra or Material, and be like, no, this is the way these people do it.

00:40:38.650 --> 00:40:43.920
Scott: This is what works without using data. They just use the fact that it exists as the data.

00:40:44.320 --> 00:40:51.980
Scott: People in tech sometimes talk a lot about data, but they don't really use the numbers, which is something, a gripe of mine, I guess.

00:40:52.010 --> 00:40:52.460
Scott: Go ahead, Matt.

00:40:53.380 --> 00:40:53.640
Matt: Oh, yeah.

00:40:53.850 --> 00:40:57.660
Matt: I mean, you have to be data-driven in this day and age, of course.

00:40:58.840 --> 00:41:02.260
Matt: No, yeah, I think, yeah, it's interesting, right?

00:41:02.380 --> 00:41:16.580
Matt: I think the distribution format for shadcn maybe was, and I can't say for sure, but maybe the intention there was to say, hey, you have control to customize and to innovate.

00:41:17.620 --> 00:41:26.280
Matt: But I think a lot of people just took it at face value and said, okay, my components that I pulled in used to live in node modules, but now they live in ./ui.

00:41:28.100 --> 00:41:31.680
Matt: And I think most people just say, okay, that's just whatever.

00:41:32.220 --> 00:41:33.040
Matt: That's just a change.

00:41:34.420 --> 00:41:39.880
Matt: I don't think a lot of people maybe saw that as an opportunity to dig in and customize.

00:41:40.110 --> 00:41:43.700
Matt: I think there's still that maybe fear in the back of the head of,

00:41:44.190 --> 00:41:45.520
Matt: if we customize, now we can't upgrade.

00:41:47.180 --> 00:41:48.900
Matt: And they're maybe worried about that.

00:41:49.350 --> 00:41:54.160
Matt: I think in general, maybe this is more of a closing thought

00:41:55.460 --> 00:41:57.580
Matt: than maybe a continuation of the thread.

00:41:58.340 --> 00:42:01.500
Matt: I think really what we've been seeing is this enshittification

00:42:01.620 --> 00:42:05.380
Matt: trend feels like it could be attributable to,

00:42:06.380 --> 00:42:09.860
Matt: you know, a sort of lack of innovation or creativity, and I think there's

00:42:10.020 --> 00:42:13.700
Matt: a lot of maybe societal pressure

00:42:13.860 --> 00:42:17.980
Matt: there to say, yeah, just do the common thing, don't

00:42:18.160 --> 00:42:21.880
Matt: try to explore new avenues. You know, cultural aspects there,

00:42:22.000 --> 00:42:26.000
Matt: whether it's culture as a company or culture as

00:42:26.020 --> 00:42:34.700
Matt: a country. And yeah, I think it's just, I don't know, people aren't willing to

00:42:34.820 --> 00:42:40.840
Matt: rethink, or maybe think from, you know, first principles.

00:42:41.640 --> 00:42:46.700
Matt: I think that's one of the core aspects, right? Even to that point of the blog

00:42:46.840 --> 00:42:54.040
Matt: post, you know, Nobody Cares, I think people are just not incentivized

00:42:54.040 --> 00:43:00.720
Matt: to apply that effort, or to create new things

00:43:01.720 --> 00:43:04.900
Matt: and explore new boundaries and provide value.

00:43:06.600 --> 00:43:06.980
Matt: Yeah, I don't know.

00:43:07.100 --> 00:43:11.820
Matt: It's a, yeah, I feel like I'm maybe like circling the drain on like that thread,

00:43:12.020 --> 00:43:15.620
Matt: but like maybe there's like some insight I haven't come to yet there.

00:43:15.960 --> 00:43:16.620
Matt: But yeah.

00:43:17.720 --> 00:43:21.560
Scott: I want to bring up the AI stuff too for a second because like, you know,

00:43:22.760 --> 00:43:25.960
Scott: Vercel also has an AI solution.

00:43:26.800 --> 00:43:28.300
Scott: The name is escaping me right now.

00:43:28.960 --> 00:43:29.440
Scott: What is it, Matt?

00:43:29.540 --> 00:43:29.800
Matt: V0.

00:43:30.820 --> 00:43:31.500
Scott: V0, yeah.

00:43:32.320 --> 00:43:33.380
Scott: The most obvious name.

00:43:34.200 --> 00:43:37.440
Scott: V0, which basically spits out shadcn/ui for you.

00:43:37.740 --> 00:43:46.020
Scott: And the thing is, that's well and good, but it's not going to be creative or do something revolutionary and unique.

00:43:46.220 --> 00:43:49.460
Scott: It's only going to print out the patterns it was trained on.

00:43:50.020 --> 00:43:54.980
Scott: So that in itself, AI is another part of the problem.

00:43:55.660 --> 00:43:57.400
Scott: I mean, AI is going to have to be really smart.

00:43:57.680 --> 00:44:00.680
Scott: I don't think we're really that close.

00:44:01.360 --> 00:44:03.520
Scott: I know everything moves exponentially faster than we think,

00:44:03.700 --> 00:44:07.780
Scott: but I don't think we're really that close to an AI coming up with new ideas

00:44:08.730 --> 00:44:10.640
Scott: because that's just not what machine learning is.

00:44:11.320 --> 00:44:15.620
Scott: So we're going to see more of this, as I'll describe it, like soulless.

00:44:16.160 --> 00:44:17.060
Scott: This is how it works.

00:44:17.720 --> 00:44:28.500
Scott: And if people start thinking that AI is providing the right answers all of the time, it's stripping the creativity from what could happen.

00:44:30.120 --> 00:44:31.320
Matt: Yeah, yeah, 100%.

00:44:31.320 --> 00:44:34.820
Matt: Yeah, I think I saw a thread on this, like somewhere on Bluesky.

00:44:35.060 --> 00:44:38.480
Matt: Maybe I'll try to dig it up and share it in our show notes.

00:44:38.640 --> 00:44:54.340
Matt: But yeah, there was like this thread about like, you know, a lot of companies these days are now asking themselves, you know, not just like, okay, if we're adopting an open source solution, it's like, does this, you know, like how many stars on GitHub does it have?

00:44:54.360 --> 00:44:55.240
Matt: Like how maintained is it?

00:44:55.400 --> 00:45:00.500
Matt: It's like the question they're asking is like, does AI like know about this, right?

00:45:00.560 --> 00:45:04.860
Matt: Like does ChatGPT understand this library, understand how it's used?

00:45:05.780 --> 00:45:08.320
Matt: And then that's like the decision making criteria.

00:45:08.440 --> 00:45:12.560
Matt: Right, and we've seen this in practice with v0, right? It's like,

00:45:13.360 --> 00:45:18.960
Matt: Next.js is going to become the de facto React framework because so many different AI models

00:45:19.200 --> 00:45:22.960
Matt: have been trained on it, and it's sort of a self-fulfilling prophecy,

00:45:23.200 --> 00:45:29.000
Matt: where AI recommends you use Next.js to spin up a React app, you use it, it gets trained on

00:45:29.060 --> 00:45:34.160
Matt: that data for the next iteration, and so you just keep getting into this ingrained mode where

00:45:34.580 --> 00:45:38.400
Matt: that is just going to become the way that we build stuff, which is kind of unfortunate, right? It's like

00:45:39.180 --> 00:45:40.540
Matt: we're losing some of that

00:45:41.600 --> 00:45:42.420
Matt: niche part of

00:45:43.070 --> 00:45:44.940
Matt: open source community that's exploring new ideas

00:45:45.380 --> 00:45:45.620
Matt: because

00:45:47.740 --> 00:45:48.580
Matt: AI models

00:45:49.020 --> 00:45:50.820
Matt: don't know about it, and they're not trained on it,

00:45:50.820 --> 00:45:52.320
Matt: and they're not going to recommend it.

00:45:54.000 --> 00:45:54.620
Matt: Which, again,

00:45:55.140 --> 00:45:56.280
Matt: maybe leads to that enshittification.

00:45:57.300 --> 00:45:58.740
Matt: It's like everything becomes

00:45:58.740 --> 00:46:00.680
Matt: the same, and you lose

00:46:01.840 --> 00:46:03.000
Matt: value or creativity,

00:46:03.170 --> 00:46:04.800
Matt: and it's just optimizations rather

00:46:04.800 --> 00:46:05.280
Matt: than innovations.

00:46:06.640 --> 00:46:07.820
Scott: It's so ingrained now.

00:46:07.860 --> 00:46:13.900
Scott: They deprecated Create React App because they're just like, use a framework at this point.

00:46:14.040 --> 00:46:20.320
Scott: At this point it's like, sure, there's probably more, but it's use Vite, which is pretty much

00:46:20.530 --> 00:46:26.440
Scott: Create React App with some speed and processing layers that help it. And then, or it's use

00:46:26.860 --> 00:46:32.720
Scott: Next.js, or maybe there's a TanStack Start, I'm sure there

00:46:32.740 --> 00:46:39.300
Scott: is. But it's like they don't even think you should start at a bare-bones React app anymore,

00:46:39.920 --> 00:46:46.260
Scott: because there are so many formulated opinions on where you should start, and that takes the

00:46:46.620 --> 00:46:55.660
Scott: creativity out of what you're building right there.

00:46:46.620 --> 00:46:55.660
Matt: Yeah, 100%. Yeah, I mean, I don't

00:46:55.720 --> 00:46:59.580
Matt: want to necessarily cut off any further thoughts you might have on the topic, but

00:46:59.600 --> 00:47:03.440
Matt: I think we've kind of talked about the core things that we had on the

00:47:03.560 --> 00:47:06.440
Matt: docket for today. Yeah.

00:47:06.720 --> 00:47:10.500
Matt: I fear that our episode is going to be very, like, doomer.

00:47:11.340 --> 00:47:12.640
Matt: You know, I think this episode

00:47:12.690 --> 00:47:15.980
Matt: and the previous episode have had a very sort of doomer, negative outlook.

00:47:18.400 --> 00:47:19.120
Matt: Yeah. Yeah.

00:47:19.840 --> 00:47:22.520
Matt: We need to think about how we can get more maybe optimism into,

00:47:23.700 --> 00:47:26.220
Matt: into these discussions, but I think it's interesting for like,

00:47:26.360 --> 00:47:27.420
Matt: it's worth us digging into them.

00:47:29.300 --> 00:47:41.100
Scott: I don't know, yeah, maybe it seems that doomish in the sense that we enjoyed some of these things and they're being taken away.

00:47:43.640 --> 00:47:52.260
Scott: But I don't think that means we can't still be creative and still try to push these boundaries in our own time.

00:47:52.360 --> 00:48:06.100
Scott: I did tell you as a silver lining, I've been trying to create my own library, and I want to create a Command K palette as a component, and maybe that'll be a way to get some momentum behind something like that.

00:48:06.430 --> 00:48:11.040
Scott: I think that there is opportunity for new patterns to continue to emerge.

00:48:12.800 --> 00:48:15.580
Scott: They will become maybe standardized at some point.

00:48:16.640 --> 00:48:17.920
Scott: And new things will definitely happen.

00:48:18.020 --> 00:48:25.360
Scott: And it's just that we are no longer trying to focus part of our job to discover them, which is maybe a little bit of a bummer.

00:48:25.460 --> 00:48:28.840
Scott: But I wouldn't say it's as much doomerism.

00:48:29.700 --> 00:48:36.200
Scott: I do get that it's really AI that's throwing a wrench into what our job will become.

00:48:36.700 --> 00:48:37.860
Scott: It's very interesting.

00:48:39.000 --> 00:48:40.780
Scott: I did have some other thoughts on enshittification.

00:48:41.120 --> 00:48:47.660
Scott: I can't remember them right now, but it's very interesting that it now seems like corporations are like, oh, yeah, use AI and use it as much as possible.

00:48:48.180 --> 00:48:49.220
Scott: and let's move faster.

00:48:50.820 --> 00:48:51.660
Scott: That is just definitely

00:48:53.500 --> 00:48:54.440
Scott: an interesting future.

00:48:55.580 --> 00:48:57.960
Matt: Yeah, I think we probably have a whole other episode

00:48:57.980 --> 00:48:59.180
Matt: to dig into in terms of like,

00:49:00.740 --> 00:49:02.320
Matt: there's this sort of,

00:49:04.500 --> 00:49:06.120
Matt: yeah, I don't know the word for it,

00:49:06.200 --> 00:49:08.540
Matt: but like sort of a spectrum of like companies

00:49:08.880 --> 00:49:11.940
Matt: where they say that they're AI first or AI native.

00:49:12.180 --> 00:49:13.760
Matt: And I think there's sort of like,

00:49:13.980 --> 00:49:15.860
Matt: good and bad outcomes of that.

00:49:16.900 --> 00:49:19.800
Matt: And yeah, maybe we save that for the next episode.

00:49:20.460 --> 00:49:20.660
Matt: We'll see.

00:49:22.140 --> 00:49:22.320
Scott: Yeah.

00:49:23.480 --> 00:49:26.920
Scott: But like overall, this concept of enshittification,

00:49:27.340 --> 00:49:31.160
Scott: like I think there's like a lot that makes it occur.

00:49:31.640 --> 00:49:35.220
Scott: But I guess my final thoughts on it is,

00:49:35.340 --> 00:49:40.060
Scott: a lot of the time it's that a platform gets so big

00:49:40.360 --> 00:49:44.100
Scott: and has served its purpose that it stops innovating.

00:49:44.400 --> 00:49:48.540
Scott: Or, like you said, it kind of loses its mission of,

00:49:48.800 --> 00:49:52.920
Scott: what were we here for in the first place? Like Facebook, um,

00:49:53.420 --> 00:49:57.940
Scott: for a long time added features, but these apps and these,

00:49:58.560 --> 00:50:02.360
Scott: especially the bigger apps, I think that's where they're heading.

00:50:03.560 --> 00:50:05.840
Scott: Are they just going to try to become everything apps?

00:50:06.040 --> 00:50:10.040
Scott: Like China has everything apps and like Facebook is a good example of that

00:50:10.140 --> 00:50:12.620
Scott: because it's a place to like connect with friends,

00:50:13.260 --> 00:50:19.260
Scott: but now it's a messenger app too. But then also they own a messenger app. But it's also got stories like

00:50:19.700 --> 00:50:25.460
Scott: Snapchat. But also they own Instagram, and that adds stories like Snapchat. They're

00:50:25.540 --> 00:50:35.080
Scott: like these everything apps. You can do what you do on Twitter or Bluesky on there by posting.

00:50:35.360 --> 00:50:42.180
Scott: You can do all of it. So, I mean, I don't think it's the greatest, but I think they're going

00:50:42.200 --> 00:50:52.100
Scott: to start to try to have currencies, or find ways where they can just be an app that's a super app.

00:50:52.560 --> 00:50:57.560
Scott: And I'm not saying that's the right answer. I actually don't think that's the right answer,

00:50:57.800 --> 00:51:03.780
Scott: because that leads us down a path of, and maybe this is

00:51:03.860 --> 00:51:09.980
Scott: doomerism again, but these companies having so much data about us that they can start to mark up prices

00:51:10.000 --> 00:51:12.220
Scott: based on the things that they know we need.

00:51:15.880 --> 00:51:20.980
Scott: But also, the point being that when they run out of innovation,

00:51:22.700 --> 00:51:25.900
Scott: they just start saying, what's a whole new market we can get into?

00:51:26.310 --> 00:51:32.440
Scott: Or they don't look at how can we further serve the purpose of our original intent.

00:51:33.140 --> 00:51:35.240
Scott: And that's where these apps kind of have their downfall.

00:51:36.520 --> 00:51:40.500
Scott: Trying to think of an example, like Spotify, which I've had from the beginning.

00:51:40.570 --> 00:51:44.580
Scott: I don't think that we're going to see it like have a downfall,

00:51:44.760 --> 00:51:47.620
Scott: but it is disheartening that you pay for this service.

00:51:48.560 --> 00:51:49.100
Scott: It's gone up.

00:51:49.280 --> 00:51:53.400
Scott: I've had it for probably like 12, 15 years.

00:51:53.560 --> 00:51:54.440
Scott: It's a long time now.

00:51:55.040 --> 00:51:57.160
Scott: And, you know, it's not too, too much more.

00:51:57.260 --> 00:52:03.260
Scott: It's like $13 a month from its original $7, $8, $10.

00:52:03.900 --> 00:52:08.300
Scott: But at the same time, it's very disheartening to see it add ads in it.

00:52:08.740 --> 00:52:12.600
Scott: It is adding video, so they are still trying to innovate on it.

00:52:13.000 --> 00:52:14.240
Scott: I talked about the jam feature,

00:52:14.900 --> 00:52:17.420
Scott: but it's when an application becomes a place

00:52:17.460 --> 00:52:20.220
Scott: that just seems to be losing its quality.

00:52:20.780 --> 00:52:22.640
Scott: Like Amazon website, for example,

00:52:24.440 --> 00:52:27.200
Scott: was the best place to get products cheap,

00:52:27.580 --> 00:52:28.720
Scott: and then they started raising their prices,

00:52:28.960 --> 00:52:30.120
Scott: and now it's just about convenience.

00:52:30.900 --> 00:52:32.880
Scott: And then they stopped guaranteeing two days.

00:52:33.320 --> 00:52:46.820
Scott: And then, and this is an episode in itself that we still want to talk about, but all of the product display pages became about monetizing every product that they own, and less about having good, clear information about the product that you're interested in.

00:52:47.440 --> 00:53:02.540
Scott: All of that, while Amazon is still a pretty good place to shop, shows you a downward trend: they cared a lot about the customer until they had pretty much all the customers they could have.

00:53:03.160 --> 00:53:09.320
Scott: And now they have these customers in their pocket a little bit with things like, you know, people are paying yearly for Amazon subscriptions.

00:53:10.440 --> 00:53:18.060
Scott: So now they're a little less focused on the customer and more on, how can we sell more, make more money, and monetize that?

00:53:19.480 --> 00:53:32.220
Scott: So if they pulled back a little bit, and I guess this is very UI-centric, which is why we talked about shadification, and built a UI that helped the customer, almost more like theater, to find the thing that they needed.

00:53:32.500 --> 00:53:35.980
Scott: I hear so many people say like, oh, yeah, Amazon sucks when I search.

00:53:36.200 --> 00:53:39.080
Scott: I can't find the thing I need for multiple pages.

00:53:40.510 --> 00:53:47.040
Scott: Or I know that I and some friends of mine are like, oh, you need to look up the reviews and make sure the reviews aren't faked.

00:53:47.540 --> 00:53:54.080
Scott: Or you need to look at this other website that exists to look at the price trend and see how much they've raised or lowered the price.

00:53:54.440 --> 00:53:59.460
Scott: When those kinds of things start infiltrating your app, it's no longer about giving the user the best experience.

00:53:59.980 --> 00:54:01.700
Scott: And it's more about just monetizing.

00:54:02.220 --> 00:54:05.320
Scott: And that's a downward trend that you don't want to be a part of.

00:54:07.380 --> 00:54:10.540
Matt: Capitalism just leads towards this.

00:54:12.280 --> 00:54:16.800
Matt: Trying to get the most out of, or maybe extreme capitalism,

00:54:17.180 --> 00:54:19.360
Matt: trying to get the most value out of every single customer

00:54:20.500 --> 00:54:25.980
Matt: just leads to this case where it's not a great customer experience,

00:54:26.240 --> 00:54:29.400
Matt: but it's a good enough experience that they're going to convert.

00:54:30.060 --> 00:54:31.060
Matt: and the company knows that.

00:54:32.720 --> 00:54:36.220
Matt: And so they build up a moat that is like,

00:54:36.460 --> 00:54:38.060
Matt: yeah, exactly like your convenience

00:54:38.260 --> 00:54:41.260
Matt: or you've shopped on Amazon for long enough

00:54:41.420 --> 00:54:43.860
Matt: that Amazon knows that you're not going to go

00:54:43.920 --> 00:54:45.460
Matt: to some other site to start shopping at

00:54:45.700 --> 00:54:47.500
Matt: just because all the convenience

00:54:47.660 --> 00:54:48.460
Matt: that they've already built up.

00:54:49.200 --> 00:54:51.340
Matt: And so they just know that they don't need

00:54:51.440 --> 00:54:52.760
Matt: to make it 10x better.

00:54:52.960 --> 00:54:58.280
Matt: They just need to make it not 10x worse or something.

00:54:59.580 --> 00:55:03.800
Scott: Or just better than the next competitor in some form or fashion, and that doesn't have to be

00:55:04.000 --> 00:55:08.140
Scott: in the facet of UI. And I also just want to point out, we know that Amazon makes their money

00:55:08.340 --> 00:55:16.260
Scott: through AWS, and that's another example of a really bad UI. It's not good. But it's

00:55:16.260 --> 00:55:23.240
Scott: cheaper than most, and they're able to do that because they have that capital, essentially,

00:55:23.260 --> 00:55:29.780
Scott: to provide services, and they did it at the right time. But it's, again, like,

00:55:30.900 --> 00:55:36.100
Scott: I've been told they have like seven or eight design systems, and when you look at AWS, it shows

00:55:36.620 --> 00:55:41.980
Scott: half the page is white, then all of a sudden it's dark mode. And that's not,

00:55:41.980 --> 00:55:48.660
Scott: that's a no-no for users. I don't know why there's a design system at that

00:55:48.680 --> 00:55:53.400
Scott: point if the themes are completely different and the whole point is consistency. Having seven or

00:55:53.410 --> 00:56:01.700
Scott: eight of them is not a strong argument to me. But, um, I digress.

00:56:01.720 --> 00:56:07.100
Matt: Yeah, I think we've probably talked about this topic a decent amount, maybe more than we

00:56:07.420 --> 00:56:12.420
Matt: originally intended. We'll come back to it, though. Yeah, we might come back to it in

00:56:12.440 --> 00:56:19.900
Matt: the future. Scott, what's new with you? What's a new and exciting thing you've been looking at

00:56:19.920 --> 00:56:26.040
Matt: over the past few weeks that you want to talk about?

00:56:19.920 --> 00:56:26.040
Scott: Yeah. So before I talk about that, we did

00:56:26.120 --> 00:56:31.460
Scott: an experiment from our last podcast about using and not using AI. We're going to hold off on

00:56:31.560 --> 00:56:37.480
Scott: talking about that until we have Dillon back. Dillon wasn't able to make it today. So we will talk

00:56:38.520 --> 00:56:41.120
Scott: about that in a future episode.

00:56:42.200 --> 00:56:43.880
Scott: The reason why I said that beforehand

00:56:45.500 --> 00:56:49.880
Scott: is because I started to dig into Avante for Neovim.

00:56:49.960 --> 00:56:51.360
Scott: I've been watching some videos on it,

00:56:51.480 --> 00:56:53.020
Scott: and it is more Copilot,

00:56:53.260 --> 00:56:55.780
Scott: sorry, more Cursor AI-like than I thought,

00:56:56.000 --> 00:56:57.720
Scott: and I started to configure it better,

00:56:57.840 --> 00:56:58.960
Scott: and it has been stronger for me.

00:56:59.460 --> 00:57:01.020
Scott: And I want to say that it's helped me

00:57:02.360 --> 00:57:06.460
Scott: in my cop-out answer in my Golang quest

00:57:06.480 --> 00:57:07.980
Scott: of becoming a better Go engineer.

00:57:09.140 --> 00:57:11.900
Scott: But I've been using that,

00:57:11.950 --> 00:57:14.460
Scott: but also getting stronger at Go,

00:57:14.660 --> 00:57:15.900
Scott: where I'm starting to feel a lot better

00:57:16.140 --> 00:57:17.760
Scott: about my Go skill set.

00:57:18.340 --> 00:57:19.640
Scott: One of the things I haven't done yet,

00:57:19.680 --> 00:57:21.560
Scott: Bo was telling Matt actually last weekend

00:57:22.500 --> 00:57:26.020
Scott: that I'm really into just starting a Go project,

00:57:26.960 --> 00:57:28.560
Scott: or what I want to do is I want to build

00:57:28.870 --> 00:57:30.340
Scott: a backend service in Node,

00:57:30.980 --> 00:57:33.840
Scott: and then, you're looking at me funny, Matt.

00:57:34.240 --> 00:57:35.880
Scott: I want to build a backend service in Node

00:57:35.900 --> 00:57:37.060
Scott: and then refactor it to Go.

00:57:37.740 --> 00:57:41.080
Scott: I think that would be a great exercise for me

00:57:41.200 --> 00:57:44.260
Scott: to continue to grow my backend skill set.

00:57:45.000 --> 00:57:46.980
Scott: So I've been getting a lot more comfortable, though,

00:57:47.100 --> 00:57:49.120
Scott: with the language and just understanding what's possible

00:57:49.290 --> 00:57:50.140
Scott: and how everything works.

00:57:50.720 --> 00:57:52.820
Scott: And basically the differences between that and JavaScript.

00:57:53.160 --> 00:57:54.860
Scott: So it's been a ride a little bit,

00:57:55.300 --> 00:57:58.540
Scott: but I'm starting to feel confident in the language.

00:57:58.760 --> 00:58:00.540
Scott: It's been maybe three months.

00:58:00.820 --> 00:58:04.200
Scott: So that's a real big win for me.

00:58:05.420 --> 00:58:06.800
Scott: Matt, what's up with you?

00:58:09.580 --> 00:58:10.600
Matt: Oh, yeah, those are good call-outs.

00:58:11.200 --> 00:58:13.860
Matt: I've been digging more and more into,

00:58:14.180 --> 00:58:15.280
Matt: so, like, even though, yeah,

00:58:15.340 --> 00:58:16.480
Matt: our last episode talked about, like,

00:58:16.720 --> 00:58:17.780
Matt: not using AI as much,

00:58:17.900 --> 00:58:19.340
Matt: I've actually been digging into using AI

00:58:19.380 --> 00:58:19.960
Matt: a little bit more.

00:58:21.260 --> 00:58:21.660
Scott: Womp Womp.

00:58:23.920 --> 00:58:26.160
Matt: Specifically using the, like,

00:58:26.160 --> 00:58:28.740
Matt: the Composer agent feature in Cursor.

00:58:29.840 --> 00:58:31.180
Matt: But I've also been testing out this thing,

00:58:32.200 --> 00:58:32.860
Matt: like, briefly.

00:58:34.040 --> 00:58:38.420
Matt: That's called, what is it, Codename Goose,

00:58:39.380 --> 00:58:44.920
Matt: which is apparently open-sourced by Block, I think,

00:58:45.340 --> 00:58:45.840
Matt: maybe by Square.

00:58:48.060 --> 00:58:50.400
Matt: And it's this pretty cool thing where it, like,

00:58:51.900 --> 00:58:55.740
Matt: it, like, infers functionality that's available on your computer

00:58:56.820 --> 00:58:59.300
Matt: and, like, can, like, sort of build up its own functions

00:58:59.600 --> 00:59:02.340
Matt: and, like, agentic sort of resources or tools.

00:59:03.000 --> 00:59:06.540
Matt: It's using the model context protocol from Anthropic, which is pretty neat.

00:59:08.320 --> 00:59:18.060
Matt: But it has this maybe slightly scary onboarding where you boot it up and then it starts spamming you with things saying like,

00:59:18.060 --> 00:59:20.620
Matt: oh, do you want to allow this thing access to your files?

00:59:20.700 --> 00:59:22.140
Matt: Do you want to allow this thing access to the network?

00:59:22.260 --> 00:59:24.280
Matt: Do you want to allow this thing access to whatever?

00:59:24.480 --> 00:59:28.160
Matt: And it's a little bit scary, but I kind of YOLO'd it.

00:59:28.500 --> 00:59:29.360
Matt: And it's pretty neat.

00:59:30.360 --> 00:59:32.220
Matt: It could do some pretty complex stuff.

00:59:34.020 --> 00:59:35.760
Matt: so I'll have a link to that in the show notes

00:59:35.920 --> 00:59:37.820
Matt: but it's something I've been messing around with

00:59:38.900 --> 00:59:41.340
Matt: along with Cursor's agent stuff

00:59:41.960 --> 00:59:42.820
Matt: and then also I've been leveraging

00:59:44.880 --> 00:59:47.780
Matt: this MacWhisper, or SuperWhisper, I think, is what I installed,

00:59:48.400 --> 00:59:50.360
Matt: that allows me to talk to Cursor

00:59:50.700 --> 00:59:53.560
Matt: and dictate the prompt to Cursor

00:59:53.560 --> 00:59:55.180
Matt: and then Cursor then goes off and does the work

00:59:55.700 --> 00:59:56.920
Matt: which is actually really neat

00:59:58.200 --> 01:00:01.360
Matt: I won't even have to have a keyboard, I can just talk to my computer

01:00:01.380 --> 01:00:03.100
Matt: and have it do the work for me.

01:00:05.040 --> 01:00:05.660
Scott: That's pretty cool.

01:00:06.440 --> 01:00:07.720
Scott: I'm excited to see that link.

01:00:08.040 --> 01:00:10.080
Scott: I haven't checked anything like that out.

01:00:10.860 --> 01:00:13.760
Scott: I do want to make one note about, you know,

01:00:14.180 --> 01:00:16.480
Scott: you kind of have me thinking about this.

01:00:16.480 --> 01:00:18.480
Scott: We've been a little bit doomish about AI,

01:00:18.660 --> 01:00:21.000
Scott: but at the same time, you've been leaning in.

01:00:21.120 --> 01:00:22.980
Scott: And I actually, I do think it's the right thing

01:00:23.080 --> 01:00:23.900
Scott: to continue to lean in.

01:00:24.420 --> 01:00:26.160
Scott: I just think that as a software engineer,

01:00:27.380 --> 01:00:31.340
Scott: my advice to you... who was that great guru?

01:00:31.360 --> 01:00:32.520
Scott: My advice to you.

01:00:33.420 --> 01:00:34.380
Scott: My advice is to you.

01:00:35.190 --> 01:00:37.520
Scott: To continue, Matt has no clue what I'm talking about.

01:00:37.520 --> 01:00:38.960
Scott: I made a reference that just dated me.

01:00:40.200 --> 01:00:41.940
Scott: Anyway, Gang Starr's Guru.

01:00:42.360 --> 01:00:50.420
Scott: Anyway, what I would recommend is that, yeah, I think it's definitely fine to use AI,

01:00:50.630 --> 01:00:55.140
Scott: but you want to make sure when you prompt the AI that you don't just spit everything out

01:00:56.490 --> 01:00:59.300
Scott: and copy and paste it and not understand how it works.

01:00:59.560 --> 01:01:05.240
Scott: You need to find one thing about it that you can understand a little better.

01:01:05.560 --> 01:01:06.640
Scott: I'm really speaking about languages

01:01:06.820 --> 01:01:07.600
Scott: maybe you don't know that well.

01:01:08.400 --> 01:01:11.180
Scott: If you know JavaScript well, or sorry, any language well,

01:01:11.320 --> 01:01:12.400
Scott: and you're using it to spit that out,

01:01:13.320 --> 01:01:15.080
Scott: it's a lot easier for you to just look at it and say,

01:01:15.320 --> 01:01:16.420
Scott: okay, I know what it's doing.

01:01:16.700 --> 01:01:17.320
Scott: That makes sense.

01:01:17.660 --> 01:01:18.340
Scott: Here's an optimization.

01:01:18.980 --> 01:01:19.800
Scott: But when you don't know it,

01:01:19.800 --> 01:01:23.000
Scott: you want to make sure that you can break down at least one piece of it

01:01:23.180 --> 01:01:26.460
Scott: and learn from it so that you can continue to keep up with it.

01:01:27.500 --> 01:01:45.700
Scott: That's the route I've been taking, but also just doing some things at some percentage capacity to remind yourself, you know, how to solve a problem, because it's so easy to just be like, wait, how would I do this?

01:01:46.000 --> 01:01:48.760
Scott: And then feel like you have this imposter syndrome that you don't know how to solve a problem.

01:01:49.400 --> 01:02:08.360
Scott: So always take on easy tasks and small tasks or think about what you're asking it to do and how you might solve it before you just take AI's word for it so that you're not just mindlessly connecting dots and you don't feel like you are worthless.

01:02:09.580 --> 01:02:10.380
Scott: Matt, you have thoughts on that?

01:02:10.980 --> 01:02:11.300
Matt: Oh, yeah.

01:02:11.940 --> 01:02:12.860
Matt: Maybe a little bit ironic.

01:02:13.200 --> 01:02:15.320
Matt: I think I've been maybe doing the opposite.

01:02:15.620 --> 01:02:17.400
Matt: Well, in certain cases.

01:02:17.660 --> 01:02:17.800
Matt: Yeah.

01:02:18.000 --> 01:02:24.380
Matt: There are certain projects where I'm like, yes, I want to be more involved and I want to know what the code's doing.

01:02:24.430 --> 01:02:25.740
Matt: I want to write it myself, maybe.

01:02:26.580 --> 01:02:29.420
Matt: But other cases where it's just like I've been YOLO-ing it.

01:02:29.720 --> 01:02:37.040
Matt: And I was actually talking to a friend of the pod that we both know, Joe, about this sort of topic.

01:02:37.150 --> 01:02:44.780
Matt: And he was also saying that he's seen a trend of people wanting to be more involved, more hands-on with the AI recommendations, like the agentic sort of updates.

01:02:46.180 --> 01:02:48.400
Matt: And I was telling him, actually, I just want the opposite.

01:02:48.490 --> 01:02:56.120
Matt: I just want to like tell something like go do this task and then not check back on it until it's like done.

01:02:56.530 --> 01:03:05.120
Matt: Basically, I want us to get to the point where I can, you know, just tell an agent to do a change and then have it deploy it and validate it and whatnot.

01:03:05.480 --> 01:03:08.200
Matt: And, you know, I've just been sitting back doing nothing.

01:03:10.459 --> 01:03:11.780
Matt: So so I don't know.

01:03:11.900 --> 01:03:14.760
Matt: It's a maybe different perspectives there.

01:03:14.920 --> 01:03:16.080
Matt: But yeah.

01:03:18.340 --> 01:03:20.740
Scott: Yeah, well, you write code every day of the week.

01:03:20.940 --> 01:03:23.220
Scott: So maybe for you, you feel like you're still in it.

01:03:24.800 --> 01:03:27.220
Scott: But yeah, I know I do the same thing.

01:03:27.460 --> 01:03:29.400
Scott: Like, I don't think that it's wrong to do that.

01:03:29.430 --> 01:03:30.480
Scott: I think you just need to.

01:03:32.680 --> 01:03:37.340
Scott: One of the things I'm big on is taking time out of your day to make sure you're learning something new.

01:03:39.060 --> 01:03:41.680
Scott: Whether that's something that is completely unrelated to work.

01:03:42.620 --> 01:03:43.860
Scott: But it just needs to excite you.

01:03:44.320 --> 01:03:45.160
Scott: but it helps you with your career,

01:03:45.980 --> 01:03:47.940
Scott: which is one of the reasons why I use Neovim.

01:03:47.940 --> 01:03:50.100
Scott: I know I've talked about this before probably,

01:03:52.020 --> 01:03:54.880
Scott: but you want to make sure that you're getting something

01:03:56.660 --> 01:03:58.260
Scott: out of the work you do

01:03:59.020 --> 01:04:00.180
Scott: and that it's coming back to you

01:04:00.320 --> 01:04:02.240
Scott: that's growing your career in any way, shape, or form

01:04:02.320 --> 01:04:04.140
Scott: that makes you feel stronger.

01:04:04.560 --> 01:04:06.600
Scott: I think that if you want to...

01:04:06.640 --> 01:04:08.120
Scott: I also do what you do, Matt.

01:04:08.120 --> 01:04:13.820
Scott: I feed stuff into the AI and just take it and paste it and go.

01:04:13.860 --> 01:04:17.680
Scott: Sometimes you work on tasks that aren't that exciting, and sometimes that's what you do. But

01:04:18.240 --> 01:04:24.760
Scott: you want to try to make sure that you are aware, and that something engages you and makes you feel

01:04:24.960 --> 01:04:30.640
Scott: like you are solving problems too, so you still have that sense of ownership or

01:04:31.160 --> 01:04:37.859
Scott: sense of completeness, I guess, when you're done with your work.

01:04:31.160 --> 01:04:37.859
Matt: Yeah, for sure. All right,

01:04:37.880 --> 01:04:39.340
Matt: Should we call it there?

01:04:40.400 --> 01:04:40.820
Scott: All right.

01:04:41.600 --> 01:04:41.840
Scott: Peace!

01:04:44.400 --> 01:04:45.400
Matt: See you guys next week.

01:04:46.640 --> 01:04:46.960
Take care.

