WEBVTT

00:00:12.280 --> 00:00:16.139
<v Scott>Welcome to the Bikeshed Podcast, the show where we talk all things software engineering,

00:00:16.820 --> 00:00:21.480
<v Scott>agree on a topic ahead of time, and then immediately abandon it, just as the name suggests.

00:00:22.140 --> 00:00:26.980
<v Scott>I'm your co-host, a man who just cooked up his next feature with Claude, ran format on save,

00:00:27.360 --> 00:00:29.640
<v Scott>and confidently called it his own, Scott Kaye.

00:00:30.520 --> 00:00:32.640
<v Scott>And alongside me are my co-hosts.

00:00:33.340 --> 00:00:39.160
<v Scott>Even during his vacation, his GitHub contribution graph lights up like the brightest house on the block at Christmas.

00:00:40.060 --> 00:00:43.820
<v Scott>He's the reason your mom has 47 items in her online shopping cart.

00:00:44.560 --> 00:00:47.620
<v Scott>He's not responsible for the 45-second My Account load time.

00:00:48.360 --> 00:00:53.100
<v Scott>And if your company just migrated to his open source tools, that load time might actually be reasonable.

00:00:53.880 --> 00:00:54.460
<v Scott>Matt Hamlin.

00:00:55.510 --> 00:00:56.440
<v Scott>And my other co-host.

00:00:57.300 --> 00:01:02.840
<v Scott>A man whose takes are so hot that Cloudflare pays him a retainer just to keep their CDN cache warm.

00:01:03.820 --> 00:01:07.360
<v Scott>You may have heard he recently brought down Verizon by simply asking,

00:01:08.200 --> 00:01:09.460
<v Scott>yeah, but why though?

00:01:10.380 --> 00:01:12.380
<v Scott>Dillon "Spicy Take" Curry.

00:01:13.500 --> 00:01:15.920
<v Scott>Matt, Dillon, what's on the agenda today?

00:01:18.020 --> 00:01:21.300
<v Matt>This week we're going to cover some of our 2026 predictions.

00:01:22.060 --> 00:01:26.940
<v Matt>Yeah, the three of us spent maybe the past week outlining some predictions

00:01:26.960 --> 00:01:31.720
<v Matt>that we have for the year, or some, you know, maybe hopeful predictions, maybe pessimistic predictions,

00:01:32.320 --> 00:01:37.140
<v Matt>that we expect to see pan out over the next year. So yeah, without further ado, I guess, like,

00:01:37.660 --> 00:01:41.260
<v Matt>let's dig in. Dillon, do you want to share your first prediction?

00:01:43.240 --> 00:01:49.440
<v Dillon>Sure. We're probably going to have some, like, overlap, but let's see what happens. Um, all right, my first

00:01:49.490 --> 00:01:56.920
<v Dillon>one is, and I feel like if you work at a certain place this might already exist, but autonomous

00:01:56.960 --> 00:02:02.580
<v Dillon>coding agents are going to find their way into our workflows. Single prompt to deploy, that's how

00:02:02.580 --> 00:02:09.220
<v Dillon>I'm thinking about it. So I'll just, like, not have to write code anymore, and it will just, like,

00:02:09.220 --> 00:02:09.220
<v Dillon>the entire flow will be automated.

00:02:09.221 --> 00:02:09.221
<v Matt>Are you imagining this, like, getting more widely adopted? Because I

00:02:16.100 --> 00:02:22.400
<v Matt>feel like it's gotten... like, for smaller sort of side projects, things like v0 and Bolt have, like,

00:02:23.000 --> 00:02:29.860
<v Matt>done decently well at some of this. But maybe you're thinking of, like... or, you know, I guess, like,

00:02:29.890 --> 00:02:33.160
<v Matt>some of those tools, it's like you need to prompt it a decent amount, like you have to go through a

00:02:33.160 --> 00:02:33.160
<v Matt>few iterations before you, like, actually have something ready to go.

00:02:33.161 --> 00:02:33.161
<v Dillon>Yeah, I think of it as

00:02:39.360 --> 00:02:47.640
<v Dillon>more like product teams will start to just write new features using this without an

00:02:47.660 --> 00:02:47.660
<v Dillon>engineer, and they'll just let it cook.

00:02:47.661 --> 00:02:47.661
<v Scott>I hope not. Actually, I hope so, because then, when

00:02:55.860 --> 00:03:02.780
<v Scott>it's absolute crap and the enshittification of the web gets worse, the companies that realize they

00:03:02.960 --> 00:03:07.320
<v Scott>still need their engineers and they still need to think through problems, those are the companies

00:03:07.560 --> 00:03:11.320
<v Scott>that we're going to see pull to the forefront. This was actually technically one of my predictions,

00:03:12.320 --> 00:03:13.840
<v Scott>but here it's now a take.

00:03:14.440 --> 00:03:15.960
<v Scott>Since I came armed with predictions,

00:03:16.240 --> 00:03:18.060
<v Scott>I'm strapped with predictions.

00:03:18.330 --> 00:03:19.380
<v Scott>I don't need this many predictions.

00:03:19.970 --> 00:03:20.700
<v Scott>I'm just going to say,

00:03:22.380 --> 00:03:24.420
<v Scott>it's kind of a continuation of what you're saying, Dillon,

00:03:24.640 --> 00:03:29.160
<v Scott>but I think basically that companies

00:03:29.560 --> 00:03:32.340
<v Scott>that we're going to see in 2026,

00:03:32.580 --> 00:03:36.300
<v Scott>companies that actually care about the end results

00:03:36.580 --> 00:03:38.580
<v Scott>start to pull forward in front of other companies.

00:03:38.900 --> 00:03:41.120
<v Scott>We talk a lot on this podcast about enshittification,

00:03:41.820 --> 00:03:51.120
<v Scott>I think in 2026, this is the year we're going to see which companies actually put more effort into their products than just "we use AI."

00:03:53.459 --> 00:03:56.240
<v Matt>In terms of likelihood, maybe we'll do it on a scale of 1 to 10.

00:03:56.460 --> 00:03:56.860
<v Matt>I don't know.

00:03:57.260 --> 00:04:00.500
<v Matt>Like 10 being high, like it is absolute truth that's going to happen.

00:04:00.580 --> 00:04:02.640
<v Matt>And one is like probably never going to happen.

00:04:03.520 --> 00:04:05.240
<v Matt>I think we'll definitely see companies doing that.

00:04:06.040 --> 00:04:10.540
<v Matt>And so from that lens, like I think it's like maybe a six or a seven.

00:04:11.560 --> 00:04:15.840
<v Matt>in my mind. Scott, what's your rating on Dillon's prediction?

00:04:17.959 --> 00:04:22.680
<v Scott>I'm thinking like five. I think because I think this is like going to be like a 50-50. It's very

00:04:22.960 --> 00:04:27.480
<v Scott>similar to what you're saying, Matt. I think people will take those risks. But if we're using like

00:04:28.160 --> 00:04:35.439
<v Scott>my take in reacting to Dillon's, I think we're going to see that absolutely backfire for some

00:04:35.460 --> 00:04:40.840
<v Scott>companies, and we're going to see other companies who don't just go off... like, they might go all in

00:04:40.850 --> 00:04:45.380
<v Scott>on tools. But I think we're going to start to see, like, what are the things beyond the code

00:04:45.420 --> 00:04:50.160
<v Scott>is now cheap? What are the things, beyond the cheap code, that now make an app great? And I think we're

00:04:50.160 --> 00:04:55.020
<v Scott>going to start to see what companies come to the forefront for that. So I'd say it's like a five in

00:04:55.050 --> 00:05:01.639
<v Scott>the sense that it's very realistic. But when it comes to, like... we didn't agree on the spice meter

00:05:01.660 --> 00:05:02.840
<v Scott>here, but I think this is

00:05:03.780 --> 00:05:05.600
<v Scott>I think this is like an up

00:05:05.680 --> 00:05:07.460
<v Scott>there take. Like... what are the

00:05:07.680 --> 00:05:09.380
<v Scott>options? I think Diablo is the hottest

00:05:09.780 --> 00:05:11.620
<v Scott>take, right? Or do

00:05:11.620 --> 00:05:13.700
<v Scott>we have to hold off on the packets? I think it's a fire

00:05:13.900 --> 00:05:15.300
<v Scott>take. So it would be Diablo

00:05:16.000 --> 00:05:17.740
<v Scott>fire, hot

00:05:18.460 --> 00:05:18.800
<v Scott>mild

00:05:19.919 --> 00:05:21.780
<v Scott>Verde, right? Something like that.

00:05:22.140 --> 00:05:23.700
<v Scott>Anyway, I think we're in a fire

00:05:23.880 --> 00:05:25.040
<v Scott>take zone. Like this is a

00:05:26.320 --> 00:05:27.640
<v Scott>good strong take here.

00:05:28.420 --> 00:05:29.620
<v Scott>And I do think it'll happen.

00:05:29.970 --> 00:05:31.620
<v Scott>I just think there's going to be

00:05:31.640 --> 00:05:33.600
<v Scott>I mean, if someone's super successful with it, that'd be great.

00:05:33.800 --> 00:05:37.820
<v Scott>But I think there might be repercussions for some who use AI in a lazy way.

00:05:39.540 --> 00:05:44.080
<v Dillon>Yeah, what I've seen more recently is there's just more and more MCPs.

00:05:44.190 --> 00:05:48.840
<v Dillon>And then there's like more and more of using agents to invoke other agents.

00:05:49.180 --> 00:05:56.920
<v Dillon>And it's just becoming this, like, I don't know, storm where agents will be able to control agents.

00:05:57.520 --> 00:06:02.860
<v Dillon>And eventually it'll be able to like autonomously drive itself in a sense.

00:06:04.160 --> 00:06:07.240
<v Scott>Well, I think what makes the take fire, I actually think that is the future.

00:06:07.940 --> 00:06:10.140
<v Scott>Now, how long it takes to get there is hard to predict.

00:06:11.900 --> 00:06:17.100
<v Scott>But I think what makes the take fire to me is you're saying we might not need engineers this fast.

00:06:17.500 --> 00:06:29.880
<v Scott>I think personally that there's probably always going to be at least one engineer that has to now orchestrate all the agents together and understand it.

00:06:30.000 --> 00:06:32.320
<v Scott>It seems like you think that might become...

00:06:32.740 --> 00:06:37.240
<v Scott>Yeah, that's why I think your take is fire, because you're saying like, oh, we won't even need that.

00:06:37.380 --> 00:06:42.360
<v Scott>I think that the engineer is going to need to be around for longer than this year.

00:06:42.660 --> 00:06:44.100
<v Dillon>But yeah, I don't know.

00:06:44.400 --> 00:06:44.480
<v Dillon>Yes.

00:06:44.680 --> 00:06:44.760
<v Dillon>Yeah.

00:06:44.980 --> 00:06:50.260
<v Dillon>If it happens this year, it'll be surprising, but I think it's possible because I'm starting to go away from it.

00:06:50.780 --> 00:06:50.980
<v Scott>Lean in.

00:06:51.050 --> 00:06:58.940
<v Dillon>Yeah, I've seen where they're creating sandboxes where you can just like create 10 different versions of your feature in a sandbox and then you can review it.

00:06:59.310 --> 00:07:02.420
<v Dillon>I feel like a product person can just review it and approve it and just move on.

00:07:03.460 --> 00:07:05.480
<v Dillon>And like, meet us.

00:07:06.620 --> 00:07:07.940
<v Scott>It kind of reminds me of Cursor.

00:07:08.200 --> 00:07:10.540
<v Scott>Cursor is now like the designer's dream

00:07:10.990 --> 00:07:13.380
<v Scott>because like cursor can just basically do everything

00:07:13.600 --> 00:07:14.940
<v Scott>and the designer can design it.

00:07:15.460 --> 00:07:17.380
<v Scott>And it's like, we don't even need you, engineer.

00:07:18.600 --> 00:07:20.360
<v Matt>I shared this, there was this tweet

00:07:20.820 --> 00:07:21.480
<v Matt>that I saw the other day

00:07:21.480 --> 00:07:23.640
<v Matt>and I shared it with one of our Discord servers.

00:07:23.960 --> 00:07:28.060
<v Matt>But this person's like, so for those who don't know,

00:07:28.120 --> 00:07:29.720
<v Matt>there's this thing called Gastown,

00:07:30.020 --> 00:07:33.460
<v Matt>which is an orchestrator for AI agents.

00:07:34.600 --> 00:07:36.420
<v Matt>You have, like, multiple tiers of, like, a...

00:07:36.700 --> 00:07:39.980
<v Matt>You have a mayor that manages polecats,

00:07:40.900 --> 00:07:42.120
<v Matt>which do the work.

00:07:42.280 --> 00:07:43.300
<v Matt>And then there's deacons that,

00:07:43.660 --> 00:07:44.000
<v Matt>I don't know.

00:07:44.140 --> 00:07:44.680
<v Matt>There's a lot of characters,

00:07:45.260 --> 00:07:46.460
<v Matt>but Gastown,

00:07:46.740 --> 00:07:47.040
<v Matt>this thing,

00:07:47.380 --> 00:07:48.620
<v Matt>it's like an orchestrator for multiple agents.

00:07:49.220 --> 00:07:49.740
<v Matt>And this tweet,

00:07:50.000 --> 00:07:50.560
<v Matt>this guy's like,

00:07:50.960 --> 00:07:51.140
<v Matt>well,

00:07:51.140 --> 00:07:53.520
<v Matt>I started spinning up multiple Gastowns to work on it.

00:07:53.980 --> 00:07:54.200
<v Matt>It's like,

00:07:54.300 --> 00:07:55.340
<v Matt>there's multiple town instances.

00:07:55.660 --> 00:07:57.880
<v Matt>Now the towns are competing in terms of work.

00:07:57.940 --> 00:07:58.380
<v Matt>And so now,

00:07:59.000 --> 00:08:02.160
<v Matt>now there's soldier agents that go and fight for their towns.

00:08:06.819 --> 00:08:14.180
<v Matt>I can just imagine that, like, scaling out to gas countries and gas worlds, uh, by the end of 2026,

00:08:16.020 --> 00:08:16.020
<v Matt>and there's no role for us. Anyway, um, my...

00:08:16.021 --> 00:08:16.021
<v Scott>Very possible.

00:08:16.022 --> 00:08:16.022
<v Matt>My... my spice, uh...

00:08:26.440 --> 00:08:32.919
<v Matt>or my spice rating for Dillon's first prediction... uh, I think it's in between fire and verde, I think.

00:08:33.419 --> 00:08:33.419
<v Matt>Do we have... is there a spice there? I'm trying to remember the scale. Um...

00:08:33.420 --> 00:08:33.420
<v Scott>Verde is the lowest, right?

00:08:38.680 --> 00:08:38.680
<v Scott>I think the scale... I'm sorry, I'm going to try: Diablo, fire, hot, mild, verde.

00:08:38.681 --> 00:08:38.681
<v Matt>Okay, I think we should also

00:08:50.840 --> 00:08:52.720
<v Matt>throw in a white bread or a mayo

00:08:53.580 --> 00:08:54.080
<v Matt>below Verde.

00:08:55.820 --> 00:08:57.160
<v Scott>Not even on the scale.

00:08:57.300 --> 00:08:57.420
<v Matt>Okay.

00:08:59.120 --> 00:08:59.600
<v Matt>Yeah.

00:08:59.600 --> 00:09:01.100
<v Matt>I guess it's a hot take

00:09:01.140 --> 00:09:02.240
<v Matt>in my opinion.

00:09:05.000 --> 00:09:06.120
<v Matt>I don't think we're going to hit it,

00:09:06.260 --> 00:09:08.740
<v Matt>in the same way that I think some orgs will do it

00:09:08.740 --> 00:09:09.360
<v Matt>and some won't.

00:09:09.740 --> 00:09:12.360
<v Matt>I think from what I've been seeing

00:09:12.540 --> 00:09:14.420
<v Matt>in terms of AI development

00:09:14.520 --> 00:09:16.720
<v Matt>in the past six months, past year, past two years,

00:09:17.300 --> 00:09:18.900
<v Matt>I could see this

00:09:19.680 --> 00:09:20.060
<v Matt>taking off.

00:09:21.140 --> 00:09:22.940
<v Scott>Dillon's saying like engineers won't be needed.

00:09:23.210 --> 00:09:25.380
<v Scott>I don't know if anyone's going to go that far this year.

00:09:25.570 --> 00:09:28.360
<v Scott>I think that is really... that's what makes it so fire.

00:09:30.180 --> 00:09:30.360
<v Matt>All right.

00:09:30.390 --> 00:09:34.460
<v Matt>So I think we then also need to review, Scott, you shared your first prediction.

00:09:35.500 --> 00:09:37.100
<v Matt>Can you reiterate it for the listener?

00:09:37.490 --> 00:09:38.920
<v Matt>And then we can give a review of it.

00:09:39.060 --> 00:09:39.980
<v Scott>I'm not going to count that.

00:09:40.040 --> 00:09:40.900
<v Scott>I'm just going to use that.

00:09:41.060 --> 00:09:41.960
<v Scott>So I have a few predictions.

00:09:42.270 --> 00:09:44.920
<v Scott>We can, I can just use that as some of my logic.

00:09:45.330 --> 00:09:49.519
<v Scott>I do want to just, like, maybe cover it one more quick time: that there's going to be some sort

00:09:49.540 --> 00:09:53.620
<v Scott>of reward or merit for companies that care about avoiding enshittification.

00:09:54.720 --> 00:09:57.820
<v Scott>And we'll see those companies emerge in 2026.

00:09:58.380 --> 00:10:05.520
<v Scott>Maybe not like super hot take, but basically we're going to see leaders in how to use AI

00:10:05.820 --> 00:10:06.000
<v Scott>correctly.

00:10:06.080 --> 00:10:07.520
<v Scott>And that's going to actually emerge.

00:10:07.760 --> 00:10:13.180
<v Scott>Like they're going to become the gold standard for what should an engineering team do with

00:10:13.240 --> 00:10:14.700
<v Scott>AI and how should it be used?

00:10:15.160 --> 00:10:16.640
<v Scott>Similar to, I guess, Gastown.

00:10:17.160 --> 00:10:23.880
<v Scott>But hopefully it's not just like these Ralph Wiggum Gastown solutions for agents.

00:10:24.280 --> 00:10:26.320
<v Matt>Haven't you heard that Ralph Wiggum is out?

00:10:26.720 --> 00:10:28.340
<v Matt>Claude Code shipped tasks?

00:10:28.760 --> 00:10:29.580
<v Scott>It's already out.

00:10:30.360 --> 00:10:30.880
<v Matt>And it's out.

00:10:30.980 --> 00:10:32.700
<v Matt>Yeah, it's been around for two weeks and it's gone.

00:10:34.140 --> 00:10:38.680
<v Matt>Boomers are using Ralph Wiggum loops, but all the Gen Zs have moved on to,

00:10:38.960 --> 00:10:41.060
<v Matt>or Gen Alphas have moved on to something better.

00:10:41.600 --> 00:10:44.060
<v Scott>Don't hate me, but I never used a Ralph Wiggum loop.

00:10:44.180 --> 00:10:48.180
<v Scott>I mean, I use many agents, but not a Ralph Wiggum loop.

00:10:48.480 --> 00:10:49.100
<v Scott>I don't know.

00:10:49.940 --> 00:10:51.960
<v Scott>I'm still relevant in 2026.

00:10:52.600 --> 00:10:53.760
<v Scott>That's what I'm trying to do.

00:10:54.740 --> 00:10:56.040
<v Scott>Okay, so my prediction.

00:10:57.100 --> 00:10:58.900
<v Scott>My prediction, first prediction for 2026.

00:11:00.100 --> 00:11:04.400
<v Scott>AI in interviews for engineers will become the norm.

00:11:05.230 --> 00:11:10.619
<v Scott>And that means that engineers will be using AI in their interview process

00:11:11.220 --> 00:11:14.440
<v Scott>in the coding round as the norm.

00:11:16.260 --> 00:11:17.780
<v Scott>I guess I could lean in on this.

00:11:19.080 --> 00:11:22.660
<v Scott>It's pretty wild to me that you ask somebody

00:11:22.970 --> 00:11:24.960
<v Scott>how to use something that makes your job easier,

00:11:25.660 --> 00:11:27.760
<v Scott>but then you would interview them without it.

00:11:28.050 --> 00:11:30.580
<v Scott>But not only do you ask them how to use it,

00:11:30.850 --> 00:11:33.040
<v Scott>you force it down their throats

00:11:33.210 --> 00:11:34.400
<v Scott>in a lot of these companies.

00:11:34.620 --> 00:11:35.580
<v Scott>A lot of these companies are like,

00:11:35.990 --> 00:11:38.040
<v Scott>you know, tokens, we don't care about tokens.

00:11:38.340 --> 00:11:39.380
<v Scott>Burn them up, burn them up.

00:11:39.520 --> 00:11:41.840
<v Scott>You don't want to do any coding anymore.

00:11:42.020 --> 00:11:43.240
<v Scott>Just burn them tokens up.

00:11:43.580 --> 00:11:48.760
<v Scott>So you would rather spend more money using AI to solve problems,

00:11:49.480 --> 00:11:51.460
<v Scott>than hire people who are smart enough to do something?

00:11:51.580 --> 00:11:53.120
<v Scott>Well, if it's actually part of the workflow,

00:11:53.160 --> 00:11:56.060
<v Scott>we're going to see reasonable companies start to understand this

00:11:56.520 --> 00:11:59.280
<v Scott>and try to find and learn from how people use AI

00:11:59.880 --> 00:12:02.560
<v Scott>to try to come up with some of these emerging behaviors

00:12:02.920 --> 00:12:06.300
<v Scott>that companies might start to see and trends we might start to see.

00:12:07.080 --> 00:12:08.900
<v Scott>Like Ralph Wiggum and Gastown.

00:12:09.780 --> 00:12:14.700
<v Matt>Any immediate reactions, Dillon? Or do you want to jump into your, uh, you know, your rating of how

00:12:14.820 --> 00:12:14.820
<v Matt>likely it is?

00:12:14.821 --> 00:12:14.821
<v Dillon>The place I work at is already telling people to use AI during interviews. Like, that's

00:12:19.860 --> 00:12:27.480
<v Dillon>a part of our prompt, which is weird. But the reason I say it's weird is I don't think our questions

00:12:28.000 --> 00:12:33.839
<v Dillon>are written in a way that makes them work well with AI, and I think that's going to be the biggest

00:12:33.860 --> 00:12:42.340
<v Dillon>problem with asking people to use AI. If we don't rewrite our problems to work well with AI, then

00:12:42.860 --> 00:12:48.200
<v Dillon>we're just going to get really, really bad candidates. But yeah, I think this is very likely to be true

00:12:48.250 --> 00:12:48.250
<v Dillon>because I'm already seeing it. And then the spice level is... what's above mild? Hot?

00:12:48.251 --> 00:12:48.251
<v Matt>It was, uh, Diablo,

00:12:57.740 --> 00:12:59.920
<v Matt>fire, hot, mild, verde, mayo.

00:13:00.460 --> 00:13:02.040
<v Dillon>It's in between fire and hot for me.

00:13:03.220 --> 00:13:03.880
<v Scott>I think that's fair.

00:13:04.560 --> 00:13:05.420
<v Scott>It's not a hot take.

00:13:05.630 --> 00:13:07.040
<v Scott>I mean, I want to just call out too.

00:13:07.220 --> 00:13:08.460
<v Scott>Yeah, we did this a bit as well.

00:13:08.850 --> 00:13:12.140
<v Scott>And I think startups usually catch on to some of these trends faster

00:13:12.370 --> 00:13:15.280
<v Scott>because, you know, they just need people who can move quickly.

00:13:15.430 --> 00:13:17.000
<v Scott>So this is a great way to move faster.

00:13:17.820 --> 00:13:21.480
<v Scott>So smaller companies are probably quicker to take on this.

00:13:22.380 --> 00:13:23.460
<v Scott>I guess I was going to call it a risk.

00:13:23.530 --> 00:13:26.540
<v Scott>I wouldn't say necessarily it is a risk,

00:13:26.780 --> 00:13:31.160
<v Scott>But I meant to also just elevate that like bigger corporate companies, right?

00:13:31.280 --> 00:13:34.800
<v Scott>Like we're going to see Meta, we're going to see Google, we're going to see Amazon starting

00:13:34.920 --> 00:13:35.460
<v Scott>to allow it.

00:13:35.880 --> 00:13:37.360
<v Scott>So view it through that lens.

00:13:38.160 --> 00:13:41.800
<v Matt>Yeah, my prediction in terms of how likely this is true.

00:13:41.990 --> 00:13:43.160
<v Matt>Yeah, I think nine out of 10.

00:13:43.230 --> 00:13:47.840
<v Matt>I think it's, you know, I'm sure there's already companies doing this, but I think it's, I

00:13:47.850 --> 00:13:51.640
<v Matt>think we're going to see a lot of companies that maybe took the hard stance a year ago,

00:13:51.880 --> 00:13:58.820
<v Matt>six months ago, two years ago, to say, like, no AI use at all during the interview, to lean into it

00:13:59.140 --> 00:14:03.960
<v Matt>and instead, like, sort of flip that decision a little bit. Um, and then in terms of, like, the

00:14:04.080 --> 00:14:08.160
<v Matt>spiciness, yeah, I think it's actually pretty lukewarm. I think probably a fair

00:14:08.160 --> 00:14:13.180
<v Matt>verde for me. It's like, I think there's going to be people that are going to react spicy, like,

00:14:13.480 --> 00:14:18.959
<v Matt>obviously are going to have hot takes on it. But I think it's just bound to happen.

00:14:19.540 --> 00:14:23.940
<v Matt>It's like telling someone, no, you can't write your, uh... you know, you can't use

00:14:24.100 --> 00:14:29.320
<v Matt>JavaScript in your interview, you have to use C or C++. It's like, no, that's silly.

00:14:30.220 --> 00:14:35.320
<v Scott>Cool, I totally think that's fair. Matt, why don't you give us your first prediction of 2026?

00:14:36.600 --> 00:14:43.220
<v Matt>All right, let's see. Um, I guess I'll start with this one. I think this is maybe actually kind of

00:14:43.420 --> 00:14:48.940
<v Matt>counter to what our first predictions were, but I think companies will actually start to cut

00:14:48.960 --> 00:14:54.460
<v Matt>back on their AI budgets. I think they've been throwing a lot of money at it over the past year

00:14:54.560 --> 00:14:59.980
<v Matt>and a half, two years, whatever it is. Um, and I think we're going to start to see companies

00:15:00.120 --> 00:15:04.900
<v Matt>realize that actually, no, we don't need to pay for, you know, both a Claude subscription for every

00:15:05.100 --> 00:15:09.680
<v Matt>seat and a Cursor subscription for every seat and a GitHub Copilot subscription for every seat

00:15:10.360 --> 00:15:15.340
<v Matt>and ChatGPT, like all of these different services that overlap quite a bit. I think we're going to

00:15:15.240 --> 00:15:21.340
<v Matt>start to see these companies start to say, actually, we don't want to spend that much

00:15:21.440 --> 00:15:21.920
<v Matt>money on AI.

00:15:23.120 --> 00:15:28.860
<v Matt>Let's try to centralize our expenses a little bit and try not to just, I don't know, we're

00:15:28.920 --> 00:15:34.000
<v Matt>not in a zero interest rate sort of economy anymore.

00:15:34.200 --> 00:15:36.300
<v Matt>So I think that's definitely going to put additional pressure there.

00:15:38.259 --> 00:15:43.420
<v Dillon>Do you think that will reduce how much AI we're using, or are we just going to pick a winner

00:15:43.860 --> 00:15:46.140
<v Dillon>for a company that we're using AI with?

00:15:47.220 --> 00:15:49.320
<v Matt>I think it could be a little bit of a mix of both.

00:15:49.390 --> 00:15:51.880
<v Matt>I think you could see some companies saying,

00:15:52.210 --> 00:15:55.520
<v Matt>okay, we've been paying for maybe in the coding agent space,

00:15:55.680 --> 00:15:58.120
<v Matt>we've been paying for Claude and for ChatGPT, for example,

00:15:58.300 --> 00:16:01.000
<v Matt>or OpenAI's offerings.

00:16:01.720 --> 00:16:03.040
<v Matt>I think we're going to see some companies say,

00:16:03.120 --> 00:16:04.500
<v Matt>actually, we don't need to be paying for both,

00:16:04.530 --> 00:16:06.300
<v Matt>and we're just going to go in on one.

00:16:07.440 --> 00:16:10.360
<v Matt>But I think also you could see a little bit of,

00:16:11.860 --> 00:16:15.840
<v Matt>Yeah, so that's sort of condensing down into what provider you use.

00:16:16.110 --> 00:16:17.360
<v Matt>I think we'll also just generally see,

00:16:18.150 --> 00:16:20.600
<v Matt>actually we don't need to be paying for, I don't know,

00:16:21.250 --> 00:16:24.180
<v Matt>10,000 seats for this application

00:16:24.410 --> 00:16:26.440
<v Matt>when only 5,000 people are actively using it.

00:16:28.040 --> 00:16:29.440
<v Matt>Again, random numbers pulled out of a hat.

00:16:29.630 --> 00:16:30.960
<v Matt>But I think we'll start to say,

00:16:31.540 --> 00:16:35.080
<v Matt>be a little bit more careful about just throwing money at AI.

00:16:36.080 --> 00:16:36.680
<v Scott>Wow, okay.

00:16:37.420 --> 00:16:39.600
<v Scott>Dillon, do you have any more reaction before I go ahead?

00:16:40.000 --> 00:16:41.420
<v Scott>You also need to rank it.

00:16:42.040 --> 00:16:42.540
<v Dillon>You go first.

00:16:44.160 --> 00:16:50.740
<v Scott>Okay, so this one is very similar to another one of mine. Um, it's basically the same thing. Um,

00:16:51.500 --> 00:16:55.120
<v Scott>and the reason why I picked it is I feel like it's slightly a double-edged sword.

00:16:55.600 --> 00:16:59.000
<v Scott>I think you're right in the sense that, like, some companies will cut back. We know

00:16:59.700 --> 00:17:05.780
<v Scott>at a former company we worked at, they don't have a lot of tokens, which is not fun. Uh, but anyway,

00:17:06.449 --> 00:17:11.400
<v Scott>I think, like, the Magnificent Seven companies, since they've really been propping up the

00:17:11.420 --> 00:17:16.760
<v Scott>industry, they'll continue... really big companies will continue to burn tons and tons of money on

00:17:16.850 --> 00:17:21.819
<v Scott>AI because they see it as a moneymaker. However, I do agree with you in the sense that maybe some

00:17:22.420 --> 00:17:28.760
<v Scott>other companies would cut back. I guess my take was more of, like, people will start to think AI

00:17:29.140 --> 00:17:35.800
<v Scott>is super helpful, so AI prices will rise and we'll care more about what you burn tokens on.

00:17:36.320 --> 00:17:38.500
<v Scott>I don't want to use it anymore because it's very similar.

00:17:40.620 --> 00:17:42.960
<v Scott>But again, I think it's like a double-edged sword.

00:17:43.090 --> 00:17:55.380
<v Scott>I think that, yes, some companies might start to scale back to see where they can make savings because it is crazy how we're throwing money at the AI bubble, I guess, right?

00:17:56.980 --> 00:18:01.260
<v Scott>So thoughts are mainly, I think this is on a scale of 1 to 10.

00:18:02.040 --> 00:18:05.780
<v Scott>I think this is very likely a five or a six again.

00:18:06.220 --> 00:18:07.120
<v Scott>Maybe we'll just say a six.

00:18:08.520 --> 00:18:16.100
<v Scott>But it's a double-edged sword in the sense that basically companies that have the money are going to continue to just throw money at the wall.

00:18:17.720 --> 00:18:24.060
<v Scott>And companies that are more budgeted that can't really afford a black hole won't be able to handle that.

00:18:24.580 --> 00:18:29.600
<v Scott>So on the take scale, I'm going to give this a mild.

00:18:30.080 --> 00:18:31.620
<v Scott>I'm going to say that's better than it sounds.

00:18:32.300 --> 00:18:38.440
<v Scott>It's not a white bread or a verde take. It's mild, but I think it's reasonable and logical

00:18:38.920 --> 00:18:38.920
<v Scott>for these companies.

00:18:38.921 --> 00:18:38.921
<v Dillon>Um, I think this is tomato sauce level of spice.

00:18:46.320 --> 00:18:48.960
<v Matt>In that you think it is pretty likely to happen, or...

00:18:49.720 --> 00:18:49.720
<v Dillon>Just not likely to happen, and it's not very spicy.

00:18:49.721 --> 00:18:49.721
<v Matt>Oh, that's surprising to me.

00:18:56.600 --> 00:19:01.800
<v Dillon>Yeah, re... like, give me your take again and let me give you a good assessment of it

00:19:01.820 --> 00:19:01.820
<v Dillon>quick.

00:19:01.821 --> 00:19:01.821
<v Matt>Uh, my take is just that companies will start to scale back on their expenses for AI. I'm not

00:19:09.940 --> 00:19:15.300
<v Matt>necessarily saying that they're going to flip in the sense that they start to, like, not

00:19:16.000 --> 00:19:16.000
<v Matt>spend money on it, but just, like, spend less.

00:19:16.001 --> 00:19:16.001
<v Dillon>I still think this is the year... I feel like the AI train

00:19:22.850 --> 00:19:28.840
<v Dillon>is going to continue to move at full speed ahead for the entire year. I feel like there's just so

00:19:28.860 --> 00:19:36.360
<v Dillon>many new things emerging out of it right now, um, that it's just not going to slow down. And maybe

00:19:36.460 --> 00:19:41.340
<v Dillon>that's just where I am, but at the place I'm at, the investment is huge and I don't think it's going

00:19:41.520 --> 00:19:41.520
<v Dillon>anywhere anytime soon.

00:19:41.521 --> 00:19:41.521
<v Matt>So you're also kind of just, like, maybe wishful thinking or hopeful

00:19:46.180 --> 00:19:50.980
<v Matt>thinking that, uh, my prediction is false, and so your company keeps spending more

00:19:51.000 --> 00:19:51.000
<v Matt>and you get to keep consuming those tokens.

00:19:51.001 --> 00:19:51.001
<v Dillon>I just don't think it's going anywhere.

00:19:57.860 --> 00:20:01.040
<v Dillon>Our company just started the AI Coalition of Massachusetts.

00:20:03.400 --> 00:20:06.120
<v Dillon>They're trying to tie it into politics or something at this point.

00:20:06.899 --> 00:20:08.080
<v Dillon>They're not slowing it down.

00:20:10.520 --> 00:20:12.040
<v Dillon>I'm going to have to put that in the show notes.

00:20:13.700 --> 00:20:16.900
<v Scott>Who's up next? I think we make Dillon do it next again.

00:20:17.140 --> 00:20:18.620
<v Matt>Dillon, what's your second prediction?

00:20:19.600 --> 00:20:24.800
<v Dillon>I'm going to try to steer it away from AI since we all gave an AI take in 2026.

00:20:26.340 --> 00:20:27.860
<v Dillon>and maybe give you

00:20:27.980 --> 00:20:29.740
<v Dillon>a sillier take

00:20:30.140 --> 00:20:30.580
<v Dillon>this time around.

00:20:32.920 --> 00:20:34.120
<v Dillon>One of us is going to get

00:20:34.120 --> 00:20:34.620
<v Dillon>a new job

00:20:36.380 --> 00:20:38.020
<v Dillon>and one of us is going to get laid off.

00:20:38.340 --> 00:20:39.820
<v Scott>Dude, that's literally one of my takes.

00:20:40.560 --> 00:20:41.240
<v Matt>Are we talking about

00:20:42.090 --> 00:20:44.060
<v Matt>the listeners included or just the three

00:20:44.070 --> 00:20:44.070
<v Matt>of us?

00:20:44.071 --> 00:20:44.071
<v Dillon>Just us in 2026.

00:20:47.060 --> 00:20:48.080
<v Matt>And it could be the same person.

00:20:48.460 --> 00:20:48.460
<v Scott>Just us.

00:20:48.461 --> 00:20:48.461
<v Dillon>It could be the same person.

00:20:50.080 --> 00:20:52.460
<v Scott>It could be the same person.

00:20:56.980 --> 00:21:01.700
<v Scott>I think I'm out of takes. I thought I came with so many takes. Every take you guys have had, I've

00:21:01.700 --> 00:21:01.700
<v Scott>had the same take

00:21:01.701 --> 00:21:01.701
<v Matt>So, Scott, it sounds like you're, it sounds like you're 15 out of 10

00:21:06.260 --> 00:21:06.260
<v Matt>on how likely this prediction is going to be true then

00:21:06.261 --> 00:21:06.261
<v Scott>One of my takes was one of us will get laid

00:21:12.420 --> 00:21:18.180
<v Scott>off i didn't even predict we'd find a job i didn't even get that far so Dillon's done a lot more work

00:21:18.200 --> 00:21:18.200
<v Scott>there now

00:21:18.201 --> 00:21:18.201
<v Dillon>That's okay.

00:21:18.202 --> 00:21:18.202
<v Scott>Damn. I, should we go back to the AI takes, or is this, is this what we should...

00:21:26.720 --> 00:21:34.360
<v Matt>All right, so you're 10 out of 10 on how likely, and your spice meter sounds like maybe mayo, or, like,

00:21:34.720 --> 00:21:34.720
<v Matt>like in the sense that you're completely aligned with Dillon's take

00:21:34.721 --> 00:21:34.721
<v Dillon>Wait, which part? Someone will

00:21:40.180 --> 00:21:40.180
<v Dillon>get laid off

00:21:40.181 --> 00:21:40.181
<v Scott>Yeah, actually, I'm gonna be honest, I, I was trying to be a little more spicy with it.

00:21:47.160 --> 00:21:53.840
<v Scott>I don't actually think at three companies we work at, one of us is really going to get laid off.

00:21:54.060 --> 00:21:56.200
<v Scott>So I think it is a little spicy if we do.

00:21:56.560 --> 00:22:03.220
<v Scott>And that would be more along the line of a prediction of, oh, AI is so great and it's so awesome and we don't care about the budget.

00:22:03.360 --> 00:22:06.420
<v Scott>So let's just take less engineers is actually like what happens.

00:22:07.480 --> 00:22:10.100
<v Scott>I think some companies will do that.

00:22:10.560 --> 00:22:13.440
<v Scott>And I think it's not going to go as well as they think.

00:22:13.500 --> 00:22:20.040
<v Scott>Maybe, maybe I'm Pollyannaish, one would say, about that. Um, I know, like, Twitter kind of did this, right,

00:22:20.040 --> 00:22:25.520
<v Scott>in what 2021 with when elon musk came in and the app like it is definitely worse but it's still

00:22:25.700 --> 00:22:31.540
<v Scott>serviceable and it's kind of ushered in the whole enshittification of apps i have a feeling that

00:22:31.540 --> 00:22:37.000
<v Scott>we're going to get closer to a place where like some of these apps can't get that shitty i mean

00:22:37.420 --> 00:22:43.500
<v Scott>maybe not amazon and these large conglomerates but if smaller businesses take bigger risks

00:22:44.120 --> 00:22:48.060
<v Scott>on making their app allowing their apps to be worse just to save a little bit of money

00:22:48.580 --> 00:22:50.480
<v Scott>i do think we'll start to see more competition

00:22:52.140 --> 00:22:58.040
<v Matt>Gotcha. What's your spice, spice rating, your Spice-O-Meter rating, trademark?

00:23:00.440 --> 00:23:06.020
<v Scott>Um, I'm gonna say it's hot. Uh, no, I'm gonna give it, I'm gonna give them another fire, because

00:23:06.800 --> 00:23:08.060
<v Scott>if one of us gets laid off

00:23:08.980 --> 00:23:09.780
<v Dillon>I'll just be

00:23:10.240 --> 00:23:10.960
<v Dillon>I'm going to be surprised

00:23:11.270 --> 00:23:12.700
<v Matt>the first person I'm going to blame is Dillon

00:23:13.350 --> 00:23:14.000
<v Scott>he gets a fire

00:23:15.760 --> 00:23:16.800
<v Dillon>I'm the most likely

00:23:17.520 --> 00:23:18.240
<v Dillon>I'm getting a rough

00:23:18.520 --> 00:23:19.680
<v Dillon>because I came up with a take

00:23:21.340 --> 00:23:23.280
<v Dillon>I feel like it's going to be something

00:23:23.350 --> 00:23:25.320
<v Dillon>where I used AI too much at work

00:23:25.540 --> 00:23:27.600
<v Dillon>and the quality of my deliverables just goes to shit

00:23:29.640 --> 00:23:31.600
<v Dillon>I mean I might be in that bucket

00:23:33.260 --> 00:23:33.960
<v Scott>just say

00:23:34.470 --> 00:23:36.000
<v Scott>I thought you said that's what you wanted

00:23:36.020 --> 00:23:36.020
<v Scott>isn't that what they said they wanted

00:23:36.021 --> 00:23:36.021
<v Matt>And then they'll get the cue clear, like, did you say fired

00:23:43.220 --> 00:23:49.980
<v Matt>or laid off like just lose our job it doesn't matter if it's part of like a mess or just like

00:23:50.570 --> 00:23:50.570
<v Matt>matters it doesn't matter

00:23:50.571 --> 00:23:50.571
<v Scott>Uh, no, he said laid off. It's got to be laid off. You can't actively

00:23:57.860 --> 00:23:57.860
<v Scott>act a fool and get fired that won't come

00:23:57.861 --> 00:23:57.861
<v Matt>Well, I'm not trying to, I'm not trying to make his prediction

00:24:01.980 --> 00:24:01.980
<v Matt>true or at least not half of it

00:24:01.981 --> 00:24:01.981
<v Scott>Who knows, maybe you are.

00:24:01.982 --> 00:24:01.982
<v Matt>Uh, I think, I think it's an 8 out of 10

00:24:09.000 --> 00:24:09.000
<v Matt>for me i think i can see this being true sadly well

00:24:09.001 --> 00:24:09.001
<v Scott>I didn't, you didn't let me pick a number

00:24:15.880 --> 00:24:15.880
<v Scott>rating i'm i'm going with a three i'm gonna be

00:24:15.881 --> 00:24:15.881
<v Matt>A three? I thought you said a 10 out of 10.

00:24:21.920 --> 00:24:28.220
<v Scott>No, no, no. I called it a fire take, and I'm only giving it a three. I don't think it's happening

00:24:28.220 --> 00:24:28.220
<v Scott>this year

00:24:28.221 --> 00:24:28.221
<v Matt>Wow.

00:24:28.222 --> 00:24:28.222
<v Scott>Too soon.

00:24:28.223 --> 00:24:28.223
<v Matt>All right. Uh, and then I think I'm gonna give this a Diablo take. I feel

00:24:34.240 --> 00:24:34.240
<v Matt>like this is

00:24:34.241 --> 00:24:34.241
<v Scott>Oh.

00:24:34.242 --> 00:24:34.242
<v Matt>I think it's a great, I think it's a great, like, "I wasn't expecting this"

00:24:40.680 --> 00:24:45.940
<v Matt>prediction is basically what i'm saying uh i'm not saying it's great i hope it's not true again but

00:24:47.840 --> 00:24:47.840
<v Scott>It's hard for me to not say I wasn't expecting it, because I was going to make it.

00:24:47.841 --> 00:24:47.841
<v Matt>Yeah, that's true.

00:24:53.640 --> 00:24:53.640
<v Matt>All right, Scott, uh, what's your second prediction?

00:24:53.641 --> 00:24:53.641
<v Scott>Oh, man. I think I'm, like, out of predictions. Uh, hold

00:24:59.740 --> 00:25:05.680
<v Scott>on all right all right well i'm out of i'm out of good predictions now but we'll use this one

00:25:06.640 --> 00:25:06.640
<v Scott>the ai bubble will burst it's gonna burst

00:25:06.641 --> 00:25:06.641
<v Matt>This is kind of like, maybe, an even deeper take than

00:25:16.540 --> 00:25:19.960
<v Matt>what i just shared before of like where i think it's where i think companies are going to scale

00:25:19.980 --> 00:25:21.820
<v Matt>down a little bit. I think you're saying it's

00:25:22.040 --> 00:25:23.180
<v Matt>basically going to pop and

00:25:23.760 --> 00:25:25.340
<v Matt>everything goes into the shithole.

00:25:26.260 --> 00:25:27.860
<v Matt>Everything's going to go into the shithole.

00:25:28.080 --> 00:25:29.900
<v Scott>The stock market is going

00:25:29.920 --> 00:25:30.740
<v Scott>to crash.

00:25:31.820 --> 00:25:33.620
<v Scott>There's going to be no idea of recovery.

00:25:34.100 --> 00:25:35.940
<v Scott>Everyone's going to be like, should we even be using

00:25:36.140 --> 00:25:36.720
<v Scott>Claude anymore?

00:25:38.160 --> 00:25:39.760
<v Scott>AI tokens are going to be cheaper

00:25:39.980 --> 00:25:39.980
<v Scott>than water.

00:25:39.981 --> 00:25:39.981
<v Dillon>You get Diablo

00:25:42.180 --> 00:25:43.940
<v Dillon>take for this one. Good job,

00:25:44.000 --> 00:25:44.000
<v Dillon>Scott.

00:25:44.001 --> 00:25:44.001
<v Scott>Thank you.

00:25:44.002 --> 00:25:44.002
<v Dillon>I feel like

00:25:45.980 --> 00:25:46.660
<v Dillon>it's highly unlikely.

00:25:48.120 --> 00:25:49.500
<v Dillon>I actually wouldn't mind it, though.

00:25:50.179 --> 00:25:50.179
<v Dillon>so we'll see

00:25:50.180 --> 00:25:50.180
<v Scott>Yeah, I wouldn't either.

00:25:50.181 --> 00:25:50.181
<v Dillon>I don't know what the, what would happen because of it.

00:25:56.540 --> 00:26:00.020
<v Dillon>that's the thing that's i'm unclear on maybe you guys have predictions on that like what would

00:26:00.120 --> 00:26:00.120
<v Dillon>happen if the bubble did burst

00:26:00.121 --> 00:26:00.121
<v Scott>Well, your prediction might come true. We might all get laid off.

00:26:05.400 --> 00:26:05.400
<v Scott>three times bonus

00:26:05.401 --> 00:26:05.401
<v Dillon>One person, not all three.

00:26:05.402 --> 00:26:05.402
<v Scott>But you get the 3x bonus if we all get laid off.

00:26:14.120 --> 00:26:19.520
<v Matt>Yeah, I don't think it's likely to happen. I think, from what I've seen so far, I feel like this

00:26:19.540 --> 00:26:24.720
<v Matt>kind of like i don't know i feel like the government's gonna end up bailing companies out

00:26:25.060 --> 00:26:31.160
<v Matt>if it does pop because i think some of these things have become so foundational to the current

00:26:31.360 --> 00:26:36.980
<v Matt>economy that i think that if it pops it's going to like like a lot of other dominoes start to fall

00:26:37.500 --> 00:26:42.200
<v Matt>and so i think something is going to step in to stop that i'm not saying that's good i'm just saying

00:26:42.250 --> 00:26:46.960
<v Matt>i think that's going to like i think it's unlikely that happened but if it does happen something is

00:26:46.980 --> 00:26:53.860
<v Matt>going to be there to help be the circuit breaker of the economy collapsing. I think two out of 10

00:26:53.990 --> 00:26:57.360
<v Matt>in terms of likeliness. I do think it's a fire take, though. I like the take.

00:26:57.520 --> 00:27:00.280
<v Dillon>I'm not worried about the economy. What happens to us, Matt?

00:27:01.940 --> 00:27:05.920
<v Matt>We're the economy. We're part of the economy. Haven't you guys seen that South Park episode?

00:27:07.680 --> 00:27:10.580
<v Dillon>No, I haven't. Can you put the South Park episode in the show notes?

00:27:10.700 --> 00:27:17.580
<v Matt>Yeah, I'll find a rip of it somewhere online. And thank you. Uh, it's the Margaritaville one, where

00:27:17.630 --> 00:27:21.100
<v Matt>they they get a blender and then the economy goes to shit and he tries to return the blender

00:27:21.240 --> 00:27:25.760
<v Matt>anyway it's good it's worth watching even if you don't watch south park that episode i feel like

00:27:25.770 --> 00:27:30.820
<v Matt>it's good all right i think it's my turn for my second prediction unfortunately it's a little bit

00:27:31.080 --> 00:27:37.260
<v Matt>ai centric still um yeah i'm getting thumbs downs and some boos from the from the audience here

00:27:37.720 --> 00:27:42.080
<v Matt>my take is a majority of content so greater than 50 percent of content on social networks will be

00:27:42.400 --> 00:27:42.400
<v Matt>generated or enhanced with ai

00:27:42.401 --> 00:27:42.401
<v Scott>White milk.

00:27:42.402 --> 00:27:42.402
<v Dillon>Are they not already?

00:27:42.403 --> 00:27:42.403
<v Scott>Like, it's already 50.

00:27:53.200 --> 00:27:55.000
<v Scott>Whole milk. I don't know. What was your thing, mayo?

00:27:57.419 --> 00:28:02.760
<v Matt>I think certain networks have been at this point, right? Like, I feel like Facebook's hit that point.

00:28:02.900 --> 00:28:03.940
<v Matt>Twitter's definitely hit that point.

00:28:04.450 --> 00:28:06.700
<v Matt>But I don't think, for example, TikTok,

00:28:06.970 --> 00:28:10.040
<v Matt>I don't think that is a majority AI generated right now.

00:28:10.760 --> 00:28:11.700
<v Matt>I think it's going that direction.

00:28:11.960 --> 00:28:14.600
<v Scott>I don't even want to rate this take

00:28:14.840 --> 00:28:20.180
<v Scott>because straight up, I just agree with it and I hate it.

00:28:21.580 --> 00:28:21.740
<v Dillon>Yeah.

00:28:24.060 --> 00:28:25.800
<v Scott>There's nothing we can do about it, right?

00:28:26.920 --> 00:28:29.880
<v Scott>I guess I could just not buy AI generated content.

00:28:30.980 --> 00:28:32.040
<v Scott>It's like my only power.

00:28:34.139 --> 00:28:34.139
<v Scott>Not support it. God damn. I'm gonna give you a mild or a hot, but

00:28:34.140 --> 00:28:34.140
<v Matt>not a white bread. All right.

00:28:42.600 --> 00:28:47.000
<v Scott>Not giving you a white bread on it, because I didn't think of it this time. It wasn't one I

00:28:47.200 --> 00:28:54.100
<v Scott>wrote down um it's just it feels really really true like i don't know i just we're heading to

00:28:54.220 --> 00:28:58.800
<v Scott>that inevitable future and it sucks and i don't really want to talk about it anymore so let Dillon

00:28:58.820 --> 00:28:58.820
<v Scott>go

00:28:58.821 --> 00:28:58.821
<v Matt>Yeah, I think that, well, one side note is, like, I do think because of this we'll see a resurgence

00:29:06.760 --> 00:29:13.880
<v Matt>in like really close-knit social networking like apps so like you know like we in the past five

00:29:13.900 --> 00:29:17.740
<v Matt>years we've seen the move from people go to like facebook to group messages for example i feel like

00:29:17.740 --> 00:29:23.120
<v Matt>we're going to maybe go even further down that path and and get closer to like i don't know more

00:29:23.140 --> 00:29:25.040
<v Matt>tight-knit social things.

00:29:25.799 --> 00:29:30.080
<v Scott>So this is like very close to the take I was making earlier about like apps will

00:29:30.220 --> 00:29:31.680
<v Scott>like that actually like care,

00:29:32.480 --> 00:29:34.080
<v Scott>like we'll start to come to the forefront kind of,

00:29:34.280 --> 00:29:34.740
<v Scott>kind of take,

00:29:35.700 --> 00:29:38.340
<v Scott>I think we all kind of agree with what you're saying, Matt,

00:29:38.440 --> 00:29:39.500
<v Scott>but we all kind of agree that like,

00:29:40.660 --> 00:29:44.220
<v Scott>there's going to be so much garbage out there that things that actually have

00:29:44.400 --> 00:29:46.700
<v Scott>value are going to start to rise to the top in some way.

00:29:46.840 --> 00:29:48.140
<v Scott>And I definitely agree with that.

00:29:48.240 --> 00:29:49.020
<v Scott>Definitely see that.

00:29:49.940 --> 00:29:50.640
<v Scott>I guess I,

00:29:50.920 --> 00:29:51.020
<v Scott>again,

00:29:51.100 --> 00:29:52.240
<v Scott>I'm optimistic that happens.

00:29:52.940 --> 00:30:03.540
<v Dillon>I think there's going to be a greater divide in terms of a group of people that become more and more antisocial because they're online terminally with AI-generated content.

00:30:04.640 --> 00:30:15.180
<v Dillon>And then there's going to be another side, which we're already starting to see emerge a little bit, which is the people that just cut out technology completely with these dumb phones.

00:30:16.540 --> 00:30:21.160
<v Dillon>I think there's going to be maybe a bit more people going in that route in the future.

00:30:21.240 --> 00:30:32.480
<v Dillon>I've already found that I've already stopped using almost all social media to the point where all I use is Reddit, but even Reddit is becoming like 50% AI generated content.

00:30:33.180 --> 00:30:36.680
<v Dillon>And then the comments are everyone complaining that everything's AI all the time.

00:30:38.320 --> 00:30:42.960
<v Scott>Well, Reddit, it's like 75% of ChatGPT responses, which blows my mind.

00:30:43.930 --> 00:30:43.930
<v Scott>But

00:30:43.931 --> 00:30:43.931
<v Dillon>Yeah, maybe it's fine.

00:30:45.270 --> 00:30:47.320
<v Scott>I also just, it's funny.

00:30:47.420 --> 00:30:50.780
<v Scott>I just agree with you about like, I said this to Matt before.

00:30:52.400 --> 00:30:53.240
<v Scott>but I just feel like

00:30:54.340 --> 00:30:55.400
<v Scott>the older I get I guess

00:30:55.520 --> 00:30:57.440
<v Scott>or just something to do with being

00:30:57.490 --> 00:30:59.580
<v Scott>an engineer like I used to love like social

00:30:59.650 --> 00:31:01.180
<v Scott>media when I was younger maybe it was just age

00:31:01.760 --> 00:31:03.440
<v Scott>I like despise it

00:31:03.610 --> 00:31:05.300
<v Scott>I don't care for it

00:31:05.330 --> 00:31:07.400
<v Scott>and I it's probably what it's

00:31:07.470 --> 00:31:09.440
<v Scott>become like I know Matt and I kind of crave

00:31:09.720 --> 00:31:11.500
<v Scott>a place to share photos

00:31:11.610 --> 00:31:13.400
<v Scott>with your friends that doesn't have advertisements

00:31:13.800 --> 00:31:15.580
<v Scott>that shows you actually what you did

00:31:16.020 --> 00:31:17.620
<v Scott>over the week and it's not a competition

00:31:17.940 --> 00:31:19.380
<v Scott>about the outfit you were wearing

00:31:19.970 --> 00:31:20.980
<v Scott>and like

00:31:21.200 --> 00:31:23.960
<v Scott>how good you can look on a Friday night with your friends,

00:31:24.150 --> 00:31:26.200
<v Scott>like this fake version of you.

00:31:27.180 --> 00:31:27.880
<v Matt>Specifically on that point,

00:31:28.120 --> 00:31:29.660
<v Matt>I'll give a huge shout out to Retro,

00:31:30.360 --> 00:31:33.700
<v Matt>which is a mobile app that specifically

00:31:33.900 --> 00:31:35.440
<v Matt>is just for sharing photos with friends

00:31:36.440 --> 00:31:37.600
<v Matt>on a week-by-week basis.

00:31:38.900 --> 00:31:39.980
<v Matt>On any given week,

00:31:40.010 --> 00:31:42.300
<v Matt>you upload the photos you want to share during that week

00:31:42.640 --> 00:31:43.660
<v Matt>and your friends can see them.

00:31:44.560 --> 00:31:46.880
<v Dillon>I can see more niche social platforms coming up

00:31:47.200 --> 00:31:48.120
<v Dillon>and getting more popular.

00:31:48.600 --> 00:31:51.280
<v Dillon>Cause I've found that like I actually use Strava,

00:31:51.400 --> 00:31:54.500
<v Dillon>which is a social platform, but it's very niche,

00:31:55.560 --> 00:31:57.800
<v Dillon>but I can still share photos and like chat with people.

00:31:58.200 --> 00:32:02.440
<v Matt>- Yeah, you probably won't see AI bots on Strava, most likely,

00:32:02.620 --> 00:32:04.620
<v Matt>because, well, I mean, you probably could,

00:32:05.740 --> 00:32:06.980
<v Matt>but you know, you kind of like,

00:32:07.040 --> 00:32:09.080
<v Matt>it would be interesting if Strava introduced a way that like,

00:32:09.420 --> 00:32:12.600
<v Matt>the only way you can post is if you'd go and do a workout

00:32:12.860 --> 00:32:14.520
<v Matt>or a run or whatever, you know, like that,

00:32:14.600 --> 00:32:16.740
<v Matt>that's the only way that you could create content.

00:32:16.920 --> 00:32:20.800
<v Matt>That would be interesting as like a barrier to prevent bots and whatnot.

00:32:21.520 --> 00:32:22.620
<v Dillon>Yeah, I don't think they'll do that.

00:32:22.720 --> 00:32:23.600
<v Dillon>But yeah, what do you have, Scott?

00:32:25.200 --> 00:32:27.720
<v Scott>The one thing I didn't like about Retro, maybe it's not the same anymore.

00:32:27.850 --> 00:32:33.680
<v Scott>I just got back on it, Matt, was to see other people's posts.

00:32:34.370 --> 00:32:35.500
<v Scott>You yourself have to post.

00:32:36.360 --> 00:32:36.520
<v Matt>Yeah.

00:32:36.720 --> 00:32:43.760
<v Scott>And I think that's, I like it in the sense that like, I don't know, it's creepy if accounts never post, right?

00:32:43.920 --> 00:32:45.980
<v Scott>But also, I don't like the barrier to entry.

00:32:46.100 --> 00:32:50.480
<v Scott>Like it should be like, do your first post, not like you haven't posted this week, which

00:32:50.480 --> 00:32:53.900
<v Scott>I feel like is a little, little forceful and a kind of a turnoff.

00:32:53.930 --> 00:32:54.940
<v Scott>It's probably why I deleted it.

00:32:55.140 --> 00:32:59.800
<v Matt>But I agree, but I feel like it's good as like a forcing function to like, it's like

00:32:59.920 --> 00:33:02.880
<v Matt>kind of a little bit growth hacky kind of thing in the sense of like, okay, you can't

00:33:02.960 --> 00:33:04.480
<v Matt>see content if you don't also produce content.

00:33:05.040 --> 00:33:09.460
<v Matt>But I see it as like a, also like a forcing function for you to like be okay with what

00:33:09.490 --> 00:33:10.140
<v Matt>you share on retro.

00:33:10.860 --> 00:33:14.220
<v Matt>Whereas like most people will be like, well, this isn't like quality content.

00:33:14.320 --> 00:33:17.980
<v Matt>I don't want to upload this to my Instagram feed because it's just like the photo looks like shit

00:33:18.180 --> 00:33:21.940
<v Matt>or whatever. Right. But like, I feel like though they're trying to like break that barrier a little

00:33:21.940 --> 00:33:26.140
<v Matt>bit, just say like upload any photo, right? Like whether like it could be a photo you took of,

00:33:26.640 --> 00:33:29.600
<v Matt>I don't know, like your cat or something doing something weird. And it's like,

00:33:30.520 --> 00:33:34.120
<v Matt>just upload it. Like it doesn't have to be a good photo. Uh, just cause you're like sharing

00:33:34.200 --> 00:33:37.080
<v Matt>that moment. I feel, I don't know. I'm, I'm probably reading a lot more into it than

00:33:37.940 --> 00:33:42.920
<v Scott>Maybe I'm already just, just too stigmatized. What was it, Lapse? That's another one where it was

00:33:42.940 --> 00:33:42.940
<v Scott>that like you take pictures at certain times in the day

00:33:42.941 --> 00:33:42.941
<v Matt>No, that was, that was BeReal, where

00:33:50.500 --> 00:33:50.500
<v Matt>everyone would get a notification to post a photo at like the same time every day and

00:33:50.501 --> 00:33:50.501
<v Matt>let's do a

00:33:54.390 --> 00:33:59.920
<v Matt>quick speed round on predictions if anyone has a lingering prediction i have like a very quick one

00:34:00.040 --> 00:34:00.040
<v Matt>that's us centric um but i'll let someone else go first if they have a quick one

00:34:00.041 --> 00:34:00.041
<v Scott>I got two. I can go.

00:34:07.640 --> 00:34:08.240
<v Scott>I can go quick.

00:34:09.520 --> 00:34:09.740
<v Scott>All right.

00:34:10.840 --> 00:34:13.760
<v Scott>Let's start with Python will become the top paid programming language.

00:34:16.540 --> 00:34:17.179
<v Scott>Due to AI.

00:34:18.370 --> 00:34:19.179
<v Scott>Due to machine learning.

00:34:19.620 --> 00:34:20.120
<v Matt>Not TypeScript.

00:34:21.159 --> 00:34:22.320
<v Scott>No, I think right now it's Rust.

00:34:22.700 --> 00:34:27.159
<v Scott>I think the reason why we'll see Python rise is because of machine learning.

00:34:27.330 --> 00:34:29.399
<v Scott>And I just see a lot of machine learning engineers.

00:34:30.000 --> 00:34:34.300
<v Scott>The AI engineer, like all these new names coming up for types of software engineers.

00:34:34.919 --> 00:34:41.120
<v Scott>And because Python is, like, a precursor to doing a lot of actual AI work, we'll see the pay band grow

00:34:41.379 --> 00:34:46.659
<v Scott>there i was going to say python and typescript um but i think typescript is kind of high i just

00:34:46.659 --> 00:34:46.659
<v Scott>think python is gonna we're gonna see a rise python

00:34:46.660 --> 00:34:46.660
<v Matt>Well, I, just one quick note is, I feel like TypeScript

00:34:51.659 --> 00:34:58.900
<v Matt>might replace python as the go-to for machine learning work which is maybe difficult for people

00:34:58.900 --> 00:34:58.900
<v Matt>to see but i think that'll happen

00:34:58.901 --> 00:34:58.901
<v Scott>I think it's possible. I don't see it happening just yet, but

00:35:04.480 --> 00:35:07.040
<v Scott>I agree that like the more universal languages TypeScript.

00:35:07.780 --> 00:35:08.400
<v Scott>All right, moving on.

00:35:09.720 --> 00:35:12.060
<v Scott>This one is my biggest prediction of the year.

00:35:12.640 --> 00:35:15.300
<v Scott>We will have at least one podcast this year

00:35:15.880 --> 00:35:17.560
<v Scott>where AI doesn't get mentioned.

00:35:19.440 --> 00:35:20.660
<v Matt>Yeah, I could see that.

00:35:21.640 --> 00:35:21.840
<v Dillon>No.

00:35:23.260 --> 00:35:24.220
<v Scott>I don't think we have yet.

00:35:24.600 --> 00:35:25.680
<v Scott>Just want to bring it up there.

00:35:26.200 --> 00:35:26.800
<v Matt>On a related note,

00:35:27.100 --> 00:35:29.340
<v Matt>my last prediction was that we're going to end up,

00:35:29.480 --> 00:35:30.760
<v Matt>we're going to stop talking about f***ing fare

00:35:30.860 --> 00:35:32.080
<v Matt>as often as we do by the end of the year.

00:35:33.760 --> 00:35:34.260
<v Matt>That's a good one.

00:35:35.060 --> 00:35:39.600
<v Matt>so kind of similar to the ai one it was like yeah maybe we finally get an episode out where we don't

00:35:39.600 --> 00:35:39.600
<v Matt>talk ***fair

00:35:39.601 --> 00:35:39.601
<v Scott>We bleep that out now. It's like, it's like an F-bomb on this channel.

00:35:39.602 --> 00:35:39.602
<v Dillon>It's gonna be hard,

00:35:45.580 --> 00:35:45.580
<v Dillon>because I feel like we've all spent more than half of our career there. Yeah, yeah.

00:35:45.581 --> 00:35:45.581
<v Scott>Half of our career

00:35:53.280 --> 00:35:53.280
<v Scott>currently we our careers are short they're gonna be

00:35:53.281 --> 00:35:53.281
<v Matt>Well, they might, they might end up being cut

00:35:59.240 --> 00:36:00.380
<v Matt>short because of AI.

00:36:01.010 --> 00:36:03.160
<v Scott>Oh, God. Oh, God damn it. Don't say that.

00:36:04.160 --> 00:36:05.160
<v Scott>Well, maybe I can just become a

00:36:05.340 --> 00:36:06.200
<v Scott>manager now, finally.

00:36:07.440 --> 00:36:08.540
<v Scott>I'll just manage the AI.

00:36:09.340 --> 00:36:11.140
<v Matt>No, you're the super mayor of

00:36:11.240 --> 00:36:12.580
<v Matt>your Gastown instance.

00:36:12.790 --> 00:36:13.220
<v Scott>Super mayor.

00:36:14.920 --> 00:36:15.220
<v Scott>That's right.

00:36:16.140 --> 00:36:18.240
<v Matt>All right. Dillon, any last predictions

00:36:18.530 --> 00:36:19.840
<v Matt>before we jump to stand-up?

00:36:20.580 --> 00:36:21.300
<v Dillon>No, I'm good.

00:36:22.300 --> 00:36:22.940
<v Matt>You're out-predictioned.

00:36:24.140 --> 00:36:25.960
<v Matt>Stand-up. I feel like it's been

00:36:26.170 --> 00:36:27.300
<v Matt>actually a while since we've done a stand-up.

00:36:28.580 --> 00:36:29.880
<v Matt>Scott, do you want to go first?

00:36:31.180 --> 00:36:31.300
<v Scott>Sure.

00:36:31.430 --> 00:36:33.220
<v Scott>I have nothing to say.

00:36:33.570 --> 00:36:34.320
<v Scott>I haven't thought about it.

00:36:34.460 --> 00:36:39.200
<v Scott>No, I'll go ahead and say, so I have canceled my Claude subscription, Matt.

00:36:39.500 --> 00:36:40.980
<v Scott>This is probably really big news.

00:36:41.760 --> 00:36:43.880
<v Scott>Not because I don't want to have it.

00:36:46.080 --> 00:36:48.460
<v Scott>I asked at work if the program would continue.

00:36:49.920 --> 00:36:50.540
<v Scott>They said no.

00:36:50.630 --> 00:36:52.160
<v Scott>I had a Claude Max subscription.

00:36:53.240 --> 00:36:54.240
<v Scott>I want to have a Claude.

00:36:54.280 --> 00:36:56.100
<v Scott>I so badly wanted to keep it,

00:36:56.220 --> 00:36:58.600
<v Scott>but I can't justify the little amount I was using.

00:36:59.120 --> 00:37:00.660
<v Scott>I think that Matt uses the,

00:37:01.260 --> 00:37:01.380
<v Scott>um,

00:37:01.380 --> 00:37:05.380
<v Scott>the pro version and that's like $20 a month for me.

00:37:05.380 --> 00:37:06.060
<v Scott>It said 17.

00:37:06.560 --> 00:37:07.780
<v Scott>I still got to figure out why.

00:37:08.280 --> 00:37:08.460
<v Scott>Um,

00:37:08.540 --> 00:37:08.780
<v Scott>anyway,

00:37:10.400 --> 00:37:10.940
<v Scott>that one,

00:37:10.940 --> 00:37:12.540
<v Scott>I feel like is like too small.

00:37:13.520 --> 00:37:14.600
<v Scott>I was looking at the sizes.

00:37:14.900 --> 00:37:18.100
<v Scott>Basically I need half of what the max subscription gives me.

00:37:18.760 --> 00:37:19.320
<v Scott>And like,

00:37:19.440 --> 00:37:24.240
<v Scott>I would happily pay Anthropic $50 a month as opposed to a hundred

00:37:24.260 --> 00:37:31.460
<v Scott>it um i will pick it back up basically i'm waiting to know i'm gonna have time to use it because i

00:37:31.540 --> 00:37:39.660
<v Scott>don't want to just um waste my time with it i i was literally using the Claude app to answer every

00:37:39.880 --> 00:37:44.880
<v Scott>question to the point where i'm like asking clawed everything things that i would never ask it and i

00:37:44.880 --> 00:37:49.480
<v Scott>was just like i gotta burn these tokens and it's obviously probably terrible for the environment

00:37:49.880 --> 00:37:54.220
<v Scott>but i don't like want to feel like i'm not justified in what i'm spending for the product

00:37:54.680 --> 00:38:00.120
<v Scott>it's worth it. But again, like I have it at work all day. Um, you know, not that I use it for

00:38:00.380 --> 00:38:05.700
<v Scott>personal stuff here, but, um, it was just hard for me with not a lot of time, free time to, to have

00:38:05.800 --> 00:38:09.840
<v Scott>like a really expensive version. So I might try out the pro, but I'm waiting until I actually need

00:38:09.870 --> 00:38:14.740
<v Scott>to use it. Um, and then if pros not enough, I might go back up to max, but basically I just

00:38:14.750 --> 00:38:19.060
<v Scott>don't have the time to be using it too much in the free time. So we'll see how it goes. Um,

00:38:19.360 --> 00:38:22.480
<v Scott>So that's my last AI take, hopefully, of the day.

00:38:23.010 --> 00:38:23.440
<v Scott>I don't know.

00:38:23.440 --> 00:38:26.000
<v Scott>At work, I'm merging code literally as we record this.

00:38:27.760 --> 00:38:28.640
<v Scott>Things are going good.

00:38:29.010 --> 00:38:30.720
<v Scott>We talked about our performance reviews.

00:38:31.720 --> 00:38:33.080
<v Scott>We're going to probably go skiing.

00:38:33.300 --> 00:38:35.600
<v Scott>There's a giant snowstorm coming this weekend.

00:38:35.910 --> 00:38:41.440
<v Scott>We're going to get walloped like 24 inches potentially of snow over the weekend.

00:38:41.770 --> 00:38:48.260
<v Scott>I had to go grocery shopping on a Friday morning because, you know, you got to get the bread and milk in Massachusetts.

00:38:49.440 --> 00:38:52.360
<v Scott>when it's gonna snow, everyone has to go.

00:38:53.080 --> 00:38:54.680
<v Scott>So I don't know, things are going good.

00:38:54.980 --> 00:38:56.820
<v Scott>Code is being written at work.

00:38:58.160 --> 00:38:59.660
<v Scott>I wanna just maybe call out that

00:39:00.340 --> 00:39:01.580
<v Scott>I keep talking to more engineers

00:39:01.900 --> 00:39:03.820
<v Scott>about how they feel like Claude does everything for them.

00:39:05.120 --> 00:39:07.140
<v Scott>And I'm still kind of doing this thing

00:39:07.940 --> 00:39:09.100
<v Scott>where when I'm interested in a problem,

00:39:09.760 --> 00:39:10.680
<v Scott>I try to solve it

00:39:10.940 --> 00:39:13.080
<v Scott>and I try to talk to Claude in planning mode

00:39:13.520 --> 00:39:16.840
<v Scott>and I try to like have it review the changes I make

00:39:16.840 --> 00:39:20.980
<v Scott>as I make them and then talk about possible alternatives. So I am still kind of doing my job,

00:39:21.420 --> 00:39:24.700
<v Scott>but when it's something entirely boring, say just building out tests for an application I haven't

00:39:24.840 --> 00:39:30.080
<v Scott>worked on before, I just have it knock out those tests. Now, we do not want to just burn a zillion

00:39:30.220 --> 00:39:34.540
<v Scott>tokens on tests that might not be good. So I've been trying to also look at the code,

00:39:34.940 --> 00:39:39.600
<v Scott>get familiar with it, and see if we can reduce some of the complexity before we just

00:39:39.740 --> 00:39:44.560
<v Scott>burn through all those tokens. Because basically at work, we wanted to increase the coverage of

00:39:44.940 --> 00:39:49.320
<v Scott>the product we're working on, but we didn't have the confidence in making changes. So we said,

00:39:49.440 --> 00:39:53.200
<v Scott>we'll build tests. And I've come to this realization. Yeah, that's all well and good.

00:39:53.290 --> 00:39:56.420
<v Scott>But if Claude just writes all the tests for you, are you really that much more confident

00:39:56.890 --> 00:39:59.820
<v Scott>that you can make changes, or that you're not just writing crappy tests?

00:40:00.320 --> 00:40:05.180
<v Scott>So I've been moving towards a more hybrid approach of, why don't I write some of the

00:40:05.340 --> 00:40:10.700
<v Scott>types and why don't I fix some of the small bugs and inconsistencies while I'm writing these tests

00:40:10.720 --> 00:40:14.560
<v Scott>or having Claude write these tests so that I can start to have a better understanding of how the

00:40:14.620 --> 00:40:19.340
<v Scott>whole thing works and have an ownership area. So that is probably way more time than I should have

00:40:19.540 --> 00:40:19.540
<v Scott>spent. Let me bounce it to Dillon.

00:40:19.541 --> 00:40:19.541
<v Dillon>I'm pretty sure my standup update was somewhere in what Scott said.

00:40:28.880 --> 00:40:28.880
<v Dillon>I'm kidding.

00:40:28.881 --> 00:40:28.881
<v Scott>I think I did a standup update for everyone.

00:40:34.660 --> 00:40:39.700
<v Dillon>This week at work, I've been messing around with like the skills in Claude. I'd never really used

00:40:39.720 --> 00:40:44.820
<v Dillon>them at all. I saw that Vercel dropped something called skills.sh, which is just like a library of

00:40:44.960 --> 00:40:52.140
<v Dillon>these skills, which are pseudo-MCP-server type things. Maybe Matt can fill us in on that. But there's

00:40:52.140 --> 00:40:55.760
<v Dillon>a couple of them that have been kind of cool. There's one called context7 that I've been

00:40:55.860 --> 00:41:00.680
<v Dillon>using, which will like, go and find documentation and pull it in for you. And then I was playing

00:41:00.800 --> 00:41:06.839
<v Dillon>around with the Figma one, and I connected it to a design doc. And it actually did a

00:41:06.860 --> 00:41:12.180
<v Dillon>pretty good job, and I just pushed it up as a PR. And then one of the other people on my team was like,

00:41:12.180 --> 00:41:16.360
<v Dillon>why are you doing this negative margin thing, and this other thing? And it's because I

00:41:16.440 --> 00:41:21.920
<v Dillon>didn't really look at the code that it gave me before I went and asked for a review. So, yeah,

00:41:22.840 --> 00:41:29.680
<v Dillon>it's been fun. I feel like the more my company is pushing me to use these tools, the more I'm

00:41:29.680 --> 00:41:34.380
<v Dillon>just like, I might as well use them to their fullest extent and see what I can get out of

00:41:34.340 --> 00:41:39.220
<v Dillon>them while they're here. Because if Matt's prediction comes true,

00:41:40.640 --> 00:41:42.160
<v Dillon>I'm not going to have these tools soon.

00:41:43.840 --> 00:41:47.500
<v Matt>I don't know. I feel like we all sort of need to push AI usage so much until we

00:41:47.760 --> 00:41:50.600
<v Matt>realize that it's actually kind of shitty, learn a lesson, and then

00:41:51.240 --> 00:41:54.660
<v Matt>scale back and find a good sweet spot.

00:41:54.940 --> 00:41:57.560
<v Matt>I've been feeling this both at work and also on personal projects. So I've been

00:41:57.640 --> 00:42:00.760
<v Matt>like, for the past few episodes, I've been talking about this app I've been

00:42:00.820 --> 00:42:03.020
<v Matt>working on, primarily written by Claude.

00:42:03.200 --> 00:42:06.660
<v Matt>I basically haven't reviewed a single line of the code it's generated for this app.

00:42:06.890 --> 00:42:09.520
<v Matt>And it's like thousands of lines of code at this point.

00:42:10.040 --> 00:42:13.200
<v Matt>It's a pretty cool app, but there's like bugs all the time.

00:42:13.230 --> 00:42:15.280
<v Matt>And I just like go to Claude and say, like, fix this bug or whatever.

00:42:15.850 --> 00:42:19.000
<v Matt>But now I'm like, actually, this is just a big pile of slop.

00:42:19.250 --> 00:42:28.360
<v Matt>And like, how do I go from not really knowing what the code is doing, to getting a better understanding and sort of cleaning up the code a little bit?

00:42:29.100 --> 00:42:30.320
<v Matt>And then it's kind of similar at work.

00:42:30.440 --> 00:42:36.220
<v Matt>I feel like I leaned heavily into, oh, we have an agent that can spin up an issue for you and then open a PR for that issue.

00:42:36.860 --> 00:42:38.200
<v Matt>So I did that a few times over the previous weeks.

00:42:38.280 --> 00:42:43.540
<v Matt>And it's like, eh, the quality of the PRs that it's generating is not that good at the moment.

00:42:44.040 --> 00:42:54.720
<v Matt>So I just need to take a few breaths and sort of evaluate how I apply better rigor to using models to generate code.

00:42:55.260 --> 00:42:55.660
<v Matt>So I don't know.

00:42:56.460 --> 00:42:57.320
<v Dillon>It's not your problem.

00:42:57.520 --> 00:43:00.320
<v Dillon>You're supposed to use the coding reviewer skill

00:43:01.050 --> 00:43:04.180
<v Dillon>to then run five parallel agents.

00:43:04.300 --> 00:43:04.940
<v Matt>Yeah.

00:43:05.940 --> 00:43:07.700
<v Dillon>That's like one of the built-in features of that agent.

00:43:07.810 --> 00:43:08.460
<v Dillon>I was like, what?

00:43:10.860 --> 00:43:12.580
<v Dillon>I was joking with somebody at work and I was like,

00:43:12.900 --> 00:43:15.000
<v Dillon>dude, if you need a way to burn more tokens,

00:43:15.340 --> 00:43:15.840
<v Dillon>here it is.

00:43:16.440 --> 00:43:16.800
<v Dillon>You're welcome.

00:43:18.700 --> 00:43:21.020
<v Dillon>And I finally got, like, API denied

00:43:21.380 --> 00:43:23.600
<v Dillon>on the Opus model on my Claude at work.

00:43:25.020 --> 00:43:25.420
<v Matt>Nice.

00:43:25.800 --> 00:43:28.200
<v Dillon>I'm on the not-recommended model, which is trash.

00:43:29.180 --> 00:43:29.300
<v Dillon>Yeah.

00:43:31.640 --> 00:43:35.380
<v Matt>Outside of the slopageddon crisis,

00:43:37.380 --> 00:43:40.360
<v Matt>I've been trying to do a little bit of honeymoon planning,

00:43:41.360 --> 00:43:44.160
<v Matt>just sort of like looking at different locations that we might want to go to.

00:43:45.680 --> 00:43:48.800
<v Matt>But one of the places we were talking about going might be on the no travel list.

00:43:48.960 --> 00:43:52.260
<v Matt>So that's maybe taking it.

00:43:52.310 --> 00:43:53.500
<v Scott>I don't recommend Russia.

00:43:55.320 --> 00:43:55.760
<v Matt>Yeah.

00:43:55.880 --> 00:43:59.020
<v Matt>Yeah, Iran was really up there, top five.

00:44:02.860 --> 00:44:03.920
<v Scott>Saudi Arabia is nice.

00:44:05.260 --> 00:44:06.600
<v Matt>Thanks, everyone, for tuning in this week.

00:44:06.800 --> 00:44:09.700
<v Matt>Remember to leave a review for the episode and the show on your favorite podcast network.

00:44:10.180 --> 00:44:12.740
<v Matt>We appreciate 6.7 star reviews out of five the most.

00:44:13.320 --> 00:44:17.140
<v Matt>Remember, if the app or the service that you're using to get the podcast doesn't let you do that,

00:44:17.540 --> 00:44:19.440
<v Matt>it's a bug on their end, and you've got to report it to them.

00:44:20.460 --> 00:44:24.440
<v Matt>Feel free to join the BikeShed community Discord server as well to connect with other folks that listen to the podcast.

00:44:25.320 --> 00:44:26.220
<v Matt>They're also in the tech ecosystem.

00:44:26.820 --> 00:44:28.500
<v Matt>Share your thoughts with us and those other listeners.

00:44:29.700 --> 00:44:31.140
<v Matt>It's a little bit quiet in there.

00:44:31.200 --> 00:44:33.480
<v Matt>So you should join and participate.

00:44:34.620 --> 00:44:36.820
<v Matt>Additionally, if you enjoyed this episode or other episodes,

00:44:37.040 --> 00:44:38.600
<v Matt>please share it with your friends or enemies.

00:44:39.400 --> 00:44:40.000
<v Matt>We need more listeners.

00:44:41.200 --> 00:44:44.820
<v Matt>Dillon's down in the dumps because we're not getting enough listeners right now.

00:44:45.540 --> 00:44:46.440
<v Matt>It's getting a little bit sad.

00:44:47.080 --> 00:44:47.780
<v Matt>We need more people.

00:44:48.440 --> 00:44:50.060
<v Matt>Anyway, see you next week.

00:44:50.620 --> 00:44:50.960
<v Matt>Peace out.

00:44:51.360 --> 00:44:51.740
<v Dillon>Peace.

