Feature Availability & Access
When will these features be available for Windows users?
OpenAI is actively developing the Windows version of these features. The Work with Apps integration, currently exclusive to Mac, is expected to come to Windows with similar functionality and app support; Kevin Weil, who leads product at OpenAI, said the team will have much more to share about desktop capabilities going into 2025.
How do I update my ChatGPT desktop app to get the new features?
The ChatGPT desktop app automatically checks for updates when launched, but users can manually trigger an update through the app’s settings menu. Once available, a notification will appear prompting the installation of new features like Work with Apps. OpenAI pushes updates gradually to ensure stability, so some users might receive updates at different times.
Which subscription tier do I need to access these new features?
Work with Apps and the enhanced desktop features require a ChatGPT Plus or Enterprise subscription. OpenAI has confirmed that these advanced integrations, including IDE support and document collaboration features, aren’t available in the free tier. Enterprise users get additional administrative controls and security features for team-wide app integration.
Is the “Work with Apps” feature available internationally?
Work with Apps is rolling out globally to all regions where ChatGPT Plus and Enterprise subscriptions are supported. OpenAI has ensured that the integration works with international versions of supported apps like Notion, Apple Notes, and Quip. The feature maintains consistent functionality across different regions, though some third-party app availability might vary by location.
App Integration
Which specific apps are currently supported?
OpenAI’s Work with Apps feature supports a wide range of applications, including multiple IDE platforms (Xcode, VS Code, and the JetBrains ecosystem, including Android Studio, PyCharm, and RubyMine), traditional text editors (TextMate, BBEdit), MATLAB, and document applications (Apple Notes, Notion, and Quip). Terminal applications like Warp are also fully supported.
How secure is the app integration feature?
ChatGPT’s desktop app implements a permission-based system where users maintain complete control over app access. As demonstrated by Justin Rushing, ChatGPT won’t look at any app’s contents until explicitly granted permission through the “Work with Apps” interface. The integration uses macOS accessibility APIs to establish secure connections with supported applications.
Can ChatGPT access my apps without permission?
No, ChatGPT requires explicit user permission before accessing any application’s content. The product team, led by Kevin Weil, emphasized that users must actively select an app from the “Work with Apps” menu before ChatGPT can see or interact with its contents. This permission-based approach ensures user privacy and control over data access.
How do I enable/disable app permissions?
Users can manage app permissions through the ChatGPT desktop app’s interface by clicking the “Work with Apps” button. The system shows all currently running supported applications, and users can enable access by selecting specific apps from this menu. To disable access, users can simply deselect the app or close the integration session, ensuring complete control over what ChatGPT can and cannot access.
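The permission flow described above can be pictured with a short sketch. The class below is purely illustrative (OpenAI’s actual implementation relies on macOS accessibility APIs and is not public); it models only the behavior described in this section: a running app is not readable until the user explicitly grants access, and revoking access takes effect immediately.

```python
class AppConnector:
    """Hypothetical model of permission-gated app access.

    An app being *running* does not make it readable; content is
    only returned after an explicit user grant, and a revoke cuts
    off access immediately.
    """

    def __init__(self):
        self._granted = set()   # apps the user has explicitly enabled
        self._contents = {}     # stand-in for live app content

    def register_app(self, name, contents):
        # Corresponds to an app merely being detected as running.
        self._contents[name] = contents

    def grant(self, name):
        # Corresponds to the user selecting the app in the menu.
        self._granted.add(name)

    def revoke(self, name):
        self._granted.discard(name)

    def read(self, name):
        if name not in self._granted:
            raise PermissionError(f"user has not granted access to {name}")
        return self._contents[name]
```

In this model, `register_app` corresponds to an app appearing in the Work with Apps list, while `grant` corresponds to the user actively selecting it, matching the control flow Justin Rushing demonstrated.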
Technical Implementation
What are the system requirements for these features?
ChatGPT’s desktop app is designed to be lightweight and resource-efficient, as emphasized by Justin Rushing during the demonstration. The app runs natively on macOS, requiring minimal system resources while maintaining full functionality; specific minimum hardware requirements weren’t stated during the session. OpenAI has optimized the app to run smoothly alongside other applications without causing performance issues.
How do I set up keyboard shortcuts?
The ChatGPT desktop app comes with two pre-configured keyboard shortcuts: Option+Space for general access and Option+Shift+1 for direct app integration. As demonstrated by John Nastos, the Option+Shift+1 shortcut automatically pairs ChatGPT with the currently active supported application, streamlining the workflow.
Does the app integration work offline?
The Work with Apps feature requires an internet connection to function, as it utilizes OpenAI’s models for processing and generating responses. The app maintains connection with OpenAI’s servers to provide features like Advanced Data Analysis, web search integration, and voice mode capabilities.
How resource-intensive is the desktop app?
Justin Rushing specifically highlighted that the ChatGPT desktop app is “really lightweight” and “doesn’t use a lot of resources.” Being a fully native application rather than a browser-based solution, it’s optimized to run efficiently in its own window while allowing users to seamlessly switch between tasks without performance impact.
IDE & Development Support
Which IDEs are officially supported?
OpenAI’s Work with Apps feature supports a comprehensive range of IDEs, including Xcode, Visual Studio Code, and the entire JetBrains ecosystem (Android Studio, PyCharm, RubyMine). John Nastos demonstrated that traditional Mac text editors like TextMate and BBEdit are also fully supported. MATLAB integration has been specifically added with students and academic users in mind.
Can I use multiple IDEs simultaneously?
ChatGPT’s desktop app allows users to work with multiple IDEs simultaneously through the “Work with Apps” menu. The system, as shown by John Nastos, automatically detects running supported IDEs, and users can switch between them using the Option+Shift+1 shortcut to quickly pair ChatGPT with the currently active IDE window.
How does the code suggestion feature work?
The code suggestion feature leverages OpenAI’s models, including O1 and O1 Pro, to analyze the code context and provide solutions. As demonstrated during John’s Xcode session, ChatGPT can observe the code through macOS accessibility APIs, understand the programming context, and generate appropriate code solutions, including complex implementations like observers and event handlers.
What programming languages are supported?
The Work with Apps feature supports programming languages compatible with its supported IDEs, including Swift (demonstrated in Xcode), Python (PyCharm), Ruby (RubyMine), and languages supported in VS Code. The integration works with the IDE’s native features while providing additional AI-powered assistance through ChatGPT’s understanding of multiple programming languages.
Document Integration
How does the Notion/Apple Notes/Quip integration work?
ChatGPT’s Work with Apps feature connects directly with these document applications through macOS accessibility APIs. As demonstrated by Justin Rushing with his walking tour document in Notion, users can highlight specific sections of text, and ChatGPT can understand the full document context. The integration allows for seamless two-way interaction, with users able to copy suggestions directly back into their documents.
Can ChatGPT maintain document formatting?
OpenAI’s integration preserves document formatting when working with Notion, Apple Notes, and Quip. During Justin’s demonstration, ChatGPT was able to analyze the existing document style and maintain consistent formatting when generating new content for his walking tour, ensuring the new text matched the tone and structure of previous sections.
Is there a limit to document size or length?
The demonstration didn’t explicitly address document size limitations, but ChatGPT showed the ability to work with both selected portions and entire documents. Justin highlighted specific sections while allowing ChatGPT to reference the full document context for style matching and content understanding.
How does the style matching feature work?
ChatGPT analyzes the existing document content to learn the writer’s voice and writing style. As shown in Justin’s tour guide example, users can specifically request style matching with commands like “make it match the style of the rest of the stops,” and ChatGPT will generate content that mimics the tone, length, and writing patterns of the existing document.
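A style-matching request like the one above can be pictured as a structured prompt carrying three things: the full document for context, the highlighted selection, and the instruction. The helper below is a hypothetical sketch of that layout; the function name and message schema are illustrative assumptions, not OpenAI’s documented prompt format.

```python
def build_style_match_request(document: str, selection: str,
                              instruction: str) -> list:
    """Assemble a chat-style message list for a style-matching edit.

    Illustrative only: the full document is supplied so the model can
    learn the writer's voice, the highlighted span marks what to
    rewrite, and the instruction carries the user's request.
    """
    return [
        {"role": "system",
         "content": ("You are helping edit a document. Match the tone, "
                     "length, and structure of the existing sections.")},
        {"role": "user",
         "content": (f"Full document for context:\n{document}\n\n"
                     f"Selected section to rewrite:\n{selection}\n\n"
                     f"Instruction: {instruction}")},
    ]
```

Including the whole document, not just the selection, is what makes requests like “make it match the style of the rest of the stops” possible: the model needs the surrounding sections to imitate them.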
Voice Feature
How do I activate the voice mode?
Voice mode can be activated through the microphone icon in the bottom right corner of the ChatGPT desktop app interface, as demonstrated by John Nastos. The feature works seamlessly with the Work with Apps integration, allowing users to have voice conversations while working with documents or code.
What languages are supported in voice mode?
The demonstration only showcased English interaction with the special holiday-themed Santa voice persona. While OpenAI didn’t explicitly outline language support during the session, the voice mode appears to be an extension of the advanced voice features available in the desktop app.
Can I switch between text and voice mid-conversation?
John Nastos demonstrated that users can freely switch between voice and text input during the same session. The interface maintains both the microphone icon for voice input and the traditional text input field, allowing users to choose their preferred input method at any time during the interaction.
How accurate is the voice recognition?
During John’s demonstration with the holiday setlist in Apple Notes, the voice recognition showed high accuracy in understanding musical terminology and song titles. The system maintained context throughout the conversation, though John noted a small transcription error that needed correction during the demo.
Data Privacy & Security
What data can ChatGPT access from my apps?
ChatGPT can only access data from apps that users explicitly authorize through the Work with Apps menu. As Justin Rushing emphasized during the demo, ChatGPT uses macOS accessibility APIs to read content from supported applications, but only after receiving explicit permission. The system can access both visible content and full document context when authorized.
How is my data protected when using app integration?
OpenAI has implemented a strict permission-based system that prevents ChatGPT from looking at any app’s contents until explicitly granted access. The integration works through macOS’s native accessibility APIs, ensuring secure and controlled access to application content. The desktop app maintains user privacy by requiring active selection of apps before any data access occurs.
Can I revoke access to apps after granting permission?
Users maintain complete control over app permissions and can revoke access at any time by deselecting apps from the Work with Apps menu. The integration session can be ended by closing the connection or switching to a different app, giving users full control over when and how ChatGPT accesses their applications.
Is my document content stored by OpenAI?
The demonstration didn’t explicitly address content storage policies, though the system appears to operate on a session-by-session basis, requiring fresh permissions each time. When using features like web search integration, ChatGPT references external sources to verify information but maintains the context within the current session.
Advanced Features
How does the Advanced Data Analysis work with different apps?
During Justin’s terminal demonstration, ChatGPT seamlessly integrated Advanced Data Analysis to process git commit data and create holiday-themed visualizations. The feature works across supported apps, effectively making Advanced Data Analysis available to any integrated application, as Justin noted, “when we build features like Advanced Data Analysis and bring them to ChatGPT, it’s kind of like we’re bringing them to every app that ChatGPT works with.”
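The terminal demo behind this quote asked for the number of commits per day over the past two months. A typical command for that is `git log --since="2 months ago" --date=short --pretty=format:'%ad'`, which prints one date per commit (the exact command ChatGPT generated wasn’t shown on screen, so this is an assumption). The sketch below shows the aggregation step in Python on made-up dates.

```python
from collections import Counter

def commits_per_day(log_dates: str) -> dict:
    """Count commits per day, given the output of
    `git log --date=short --pretty=format:'%ad'`
    (one YYYY-MM-DD date per line, one line per commit)."""
    days = [line.strip() for line in log_dates.splitlines() if line.strip()]
    return dict(sorted(Counter(days).items()))

# Made-up example dates:
sample = "2024-12-10\n2024-12-10\n2024-12-11\n2024-12-12\n2024-12-12\n2024-12-12"
# commits_per_day(sample) -> {'2024-12-10': 2, '2024-12-11': 1, '2024-12-12': 3}
```

From here the daily counts can be fed into any charting tool, which is the step Advanced Data Analysis handled in the demo when it produced the holiday-themed bar graph.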
Can I use different OpenAI models with all integrated apps?
OpenAI demonstrated that users can select different models, including O1 and O1 Pro, when working with integrated apps. John Nastos specifically highlighted O1’s capabilities with complex coding problems, showing how it expertly handled Xcode integration and accessibility API implementations. The system allows users to switch between models based on their specific needs.
What’s the difference between browser and desktop app capabilities?
The desktop app provides significantly more capabilities than the browser version, as Kevin Weil emphasized that “being a desktop app you can do so much more than you can just in a browser tab.” The native app can see screen contents (with permission), automate desktop tasks, and integrate directly with local applications, while maintaining a lightweight footprint.
How does the web search integration work with document editing?
Justin demonstrated web search integration while writing his walking tour in Notion, where enabling the search feature allowed ChatGPT to fact-check and ground its responses in citations. Users can click through the provided links to verify information, creating what Justin called “an awesome interaction loop” where ChatGPT assists with research while maintaining document context.
Troubleshooting
What should I do if app integration isn’t working?
Based on John Nastos’s live demo troubleshooting, first ensure your ChatGPT desktop app is updated to the latest version. When John encountered an issue with the Xcode integration, he resolved it by discarding the changes and simply retrying the same request. The feature requires proper permissions and app selection through the Work with Apps menu before integration can begin.
How do I resolve keyboard shortcut conflicts?
ChatGPT’s desktop app uses Option+Space for general access and Option+Shift+1 for direct app integration. As demonstrated by Justin Rushing and John Nastos, these shortcuts are designed to be distinct from common system shortcuts. If conflicts occur, users can access the same features through the Work with Apps button in the interface.
What are common error messages and their solutions?
The demonstration showed that when code implementation didn’t work as expected in Xcode, John Nastos resolved it by discarding the changes and re-running the same O1 prompt, which produced working code on the second attempt. The system provides visual feedback when apps are successfully connected, making it clear when integration is working properly.
Future Development
What new features are planned for 2025?
Kevin Weil revealed that ChatGPT will become increasingly “agentic,” moving beyond simple Q&A to actively doing tasks on users’ behalf. OpenAI plans to expand desktop automation capabilities, allowing ChatGPT to perform more actions with user permission. The team is focusing on features that leverage the desktop app’s ability to see and interact with screen contents.
Will there be support for additional third-party apps?
The rapid addition of Warp terminal support following user requests demonstrates OpenAI’s commitment to expanding app integration. Justin Rushing highlighted how features like Advanced Data Analysis automatically become available to newly integrated apps, suggesting OpenAI’s intention to continue growing their app ecosystem beyond the current IDE and document editing applications.
Are there plans for mobile app integration?
The demonstration focused exclusively on desktop applications, with Kevin Weil emphasizing the unique capabilities that come from being a desktop app rather than a browser-based solution. No specific mentions were made about mobile integration plans during the presentation.
How will the “agentic” capabilities expand?
Kevin Weil explained that ChatGPT will evolve to actively perform tasks rather than just provide answers, similar to how Canvas currently helps improve writing and code. The desktop app’s ability to see screen contents and automate desktop work (with permission) will enable more advanced autonomous capabilities.
Full Transcript
(00:03) [Music]
Kevin Weil: Hey everybody, welcome to day 11. I’m Kevin Weil, I lead product at OpenAI, and I am definitely outclassed by the two gentlemen on my right, who I’m told did not just get these uh suits 48 hours ago on Amazon. Definitely, definitely own this already, 100%.
Kevin Weil: Yeah, all right. So you might have noticed we’ve been putting a lot of effort into our desktop apps. Uh, so we we launched our Mac desktop app about 6 months ago. We launched our Windows desktop app just a couple months ago. And as our models get increasingly powerful, Chat GPT will more and more
(00:41) Kevin Weil: become agentic. Uh, and that means we’ll go beyond just questions and answers. Chat GPT will begin doing things for you. We’re seeing that already uh with products like Canvas where uh you’re collaborating with Chat GPT to help improve your writing and your code. And that shift will continue, and Chat GPT will do more and more on your behalf. The desktop apps are a big part of that too, because being a desktop app, you can do so much more than you can just in a browser tab. That includes things like, with your permission of
(01:14) Kevin Weil: course, being able to see what’s on your screen and being able to automate a lot of the work that you’re doing on your desktop. So we’ll have a lot more to say on that as we go into 2025. But we’ve also got some exciting stuff that we’re launching today. So let’s dive in.
John Nastos: All right, hi, I’m John Nastos and I work on the Chat GPT desktop team.
Justin Rushing: My name is Justin Rushing, and I also work on the Chat GPT desktop team.
Justin Rushing: Uh, we’ve got a lot to show you today, so I’m just going to go ahead and get started here. Um, so first things first, this right
(01:45) Justin Rushing: here is the fully native Chat GPT desktop app for Mac. Uh, it does all the things that you know we’ve come to expect from our clients. Um, but what I really love about it is that, being native, it’s really lightweight, doesn’t use a lot of resources. It, uh, it lives in its own window and, um, I’m able to use it without having to context switch away from what I’m already doing. Right. So we’ve got this keyboard shortcut, option space, that makes it really fast to show and hide Chat GPT. So it’s, it’s always there when you need it. This button right here is our entry
(02:17) Justin Rushing: point for working with apps on your computer. And the way that I like to think about this feature is that we all copy and paste things into Chat GPT, right? All the time, all the time, all the time. Um, this feature makes that way smoother by, when we are working with an app on your computer, we’ll automatically pull that context in for you. So you just focus on asking your question, and we, we handle the rest. So you might notice that I’ve also got this Warp uh console window open as well. It’s currently navigated to a repository that, that I’m
(02:47) Justin Rushing: getting up to speed on. Um, and it might seem kind of silly, but I want to figure out how many commits per day are happening in this repo. You know, we talk about velocity a lot here, so I want to see that for myself. I have no idea how to do that though, so I’m going to use Chat GPT. So when I click on this button, I’m going to see all of the apps that are currently running on my computer that Chat GPT can work with. Uh, important note, until you select one of these, we will never look at the contents of another app, so you are always fully in control
(03:17) Justin Rushing: over what you’re sharing with Chat GPT. So, to get started, I’m going to click on Warp. And at this time, huge shout out to the Warp team for all of their help in getting this going. Uh, when we first announced working with apps, we did not have support for Warp, and it was, I think, literally the first request was adding support. Um, so huge shout out to the team. They worked really hard to help us uh get it ready for today, so thank you. So I’m going to get started by saying uh write a command to get the number of commits per day over the past two months.
(03:51) Justin Rushing: And now, I don’t need to tell Chat GPT that I use Git because it can tell from Warp that I do, um, and it’s just going to give me the command that I need. So I’ll push this button to copy and paste it into Warp, and I, I think this looks right. Yeah, it’s looks like the right information, but it’s also kind of hard to tell what we’re looking at, right? Yeah, I’m, I’m a visual learner myself, so, um, normally what I would do is I’d figure out how to get this into a spreadsheet, make a chart there, and then find that spreadsheet again in three years. But um,
(04:18) Justin Rushing: instead, I’m just going to, uh, I’m just going to ask for one. So make a bar graph with all of the results. Why not make it holiday themed?
Justin Rushing: Great idea. And, uh, awesome. So this is going to show off what I think is the coolest part about working with apps, which is that it works with all of the other features and all of the other models in Chat GPT. So, in this case, uh, 4o decided to use Advanced Data Analysis to crunch some numbers and give me back a bar graph. And what that means, if you really think about it, is that when we build features
(04:50) Justin Rushing: like Advanced Data Analysis and bring them to Chat GPT, it’s kind of like we’re bringing them to every app that Chat GPT works with.
Kevin Weil: Yeah, that’s great. So while it thinks about this, um, do you want to talk a little bit about what the model is actually seeing? Is it just what we see on the screen, or is it something more?
Justin Rushing: Great question. So an easy way to do this would be to just grab a screenshot and let Vision do the rest. Um, but we actually can reach into the application to grab offscreen content as well, and so uh these
(05:16) Justin Rushing: results will contain everything here, not just what you see on screen.
Kevin Weil: Yeah. It was thinking hard about this. It might be the holiday themed part.
Justin Rushing: Okay, perfect. All right, I mean, this is, that looks pretty holiday themed to me. What do you think, John?
John Nastos: Uh, it’s holiday themed. I don’t know if it’s as holiday themed as we are in these suits, but it’s not bad.
Justin Rushing: Literally nothing is as holiday themed as you are.
John Nastos: Yeah, perfect, perfect.
Justin Rushing: But I’d say it’s good enough, so I’m just going to download this, and now I can share it with a teammate. But with
(05:45) Justin Rushing: that, I’m going to hand it back to John to talk a bit more about programming.
John Nastos: Great. So I think that the use case that Justin showed is really important and useful to be able to interact with a terminal, but I want to show what it’s like to interact with code in an IDE. So I have Xcode open here, which is my IDE of choice, and it’s running a sample app that is actually a little bit of a peek behind the scenes into how this work with apps feature works. The sample app uses the Mac OS accessibility APIs to look at Xcode and tell us some things
(06:17) John Nastos: about what’s on the screen. So it’s telling us that there’s a text field with these dimensions. It tells us that it has 37 lines, and we can go down and check that, 37 lines, yep. And it shows us the content of the text field, and we actually use this to make the feature, right?
Justin Rushing: That’s right, yeah.
John Nastos: This is a useful sample app for us for sure. So, this is nice, but it doesn’t do any live updating. So I’m going to use Chat GPT to help add that feature. I’m going to bring up the chat bar with a very similar shortcut to what Justin showed earlier, but with a
(06:47) John Nastos: slight change. I’m going to use Option Shift 1, and what that does is it brings up the chat bar with Xcode automatically paired to it, Xcode being the topmost app that is open that we support with this feature. It makes it super quick to start working with an app.
Justin Rushing: Yeah, it’s great. And you get this immediate feedback that it sees Xcode here.
John Nastos: So, these accessibility APIs are a little bit inscrutable, definitely hard to remember how to use and pretty complicated actually. Um, so I’m going to use the model selector here, and I’m
(07:19) John Nastos: going to switch this to O1. O1 is one of our new newer models here at Open AI, and it does a great job thinking about these difficult coding problems. Um, and I should mention as well that uh this feature is also available with O1 Pro if you really want to throw the deep end coding problems at that model.
Kevin Weil: Yeah.
John Nastos: All right, so let’s give it a prompt here. I’m going to say add an observer uh so if selection changes, load text areas is called, and we’ll kick off this request to the model. So, O1 is one of our chain of thought models, and you can see that
(08:02) John Nastos: it’s thinking about this issue. It’s going to tell us some of the steps that it goes through as it considers and wow that was a pretty fast response from it. Didn’t have to think too hard on that one I guess.
Justin Rushing: I guess not, you got to give it a harder problem next time.
John Nastos: Yeah wow. All right, so it’s generating some code and you know I have a fair amount of uh of trust in O1’s code here, so as soon as it’s finished generating I’m just going to copy this into Xcode and we’re going to run it and see what happens. I don’t, I don’t see anything that could go wrong
(08:30) John Nastos: with that.
Justin Rushing: No demos poss.
John Nastos: Yeah live demos work 100% of the time, it was one of the rules of the universe. All right, so I’ve copied that code and I’m pasting it directly into Xcode. I’m going to take a quick scroll through it to see if it’s finding any issues, right now it’s looking pretty promising. All right, so let’s run this and see what happens.
Justin Rushing: You know it would be really cool if you didn’t have to copy and paste that back into Xcode though. That would be cool.
John Nastos: And you know people have been suggesting that. Should I build that?
Justin Rushing: You should definitely build that.
(09:01) John Nastos: All right PM approved, great. All right, so uh let’s, it’s running again, uh let’s take a look. If I select content, oh no, it didn’t, it didn’t work like we thought. Okay, should we, should we give it one more shot?
Justin Rushing: Yeah, why don’t we, why don’t we ask?
John Nastos: Yeah. All right, so I think I’m actually just going to go back to our previous state, since I don’t have a specific error here. Uh, let’s try to discard the changes. All right, let’s give it one more shot.
Justin Rushing: Yeah one more shot.
John Nastos: Yeah and while it’s working we can uh talk about some more
(09:38) John Nastos: of the features here. Okay, add an observer so if selection changes, load text areas is called again.
Justin Rushing: All right, maybe it didn’t think hard enough on that we’ll try it again.
John Nastos: Got overconfident.
Justin Rushing: Yeah.
John Nastos: While it thinks about this, I should mention that I’m using Xcode, uh like I said this is my IDE of choice when working with Swift, but we do support a whole bunch of other IDEs. Um that means VS Code, uh the JetBrains ecosystem which includes Android Studio and PyCharm, RubyMine, things like that. Some really uh standby Mac apps like
(10:18) John Nastos: TextMate and BBEdit. So we’ve got a, a whole lot of different support here.
Justin Rushing: Yeah, I’m, I’m actually unreasonably excited for MATLAB support. I would have totally used that in college.
John Nastos: Yeah, MATLAB is, is another exciting one. I think some students are really going to find that useful. Okay, it’s uh it is still generating some code. There it goes. It’s done. I’m going to use this copy button again, and again with full trust that everything is going to work, I’m going to, now we know what could go wrong though, right?
Justin Rushing: Yeah, sure.
(10:48) John Nastos: All right, so let’s run this again, and uh see if we have slightly better luck. Okay, it’s running. Hey, look at that. If I select things, it changes. Wow, it’s a holiday miracle. You got the incantation to the demo Gods right the second time.
Justin Rushing: Yeah, exactly.
Justin Rushing: Awesome. So we’ve been talking a lot about coding today, right? Uh, but another thing that I love to use Chat GPT for is helping me with my writing, and I know I’m not alone here. Um, and so that’s why today we’re announcing support for three new applications: Apple Notes, Notion, and
(11:23) John Nastos: Quip. Uh, we think this is going to open up a brand new set of use cases for working with apps, and so we can’t wait to see what you all do with it. Uh, with that, uh, John, Kevin, you already know this, but for the rest of you, uh, I give walking history tours in San Francisco, outside of, outside of work. Um, big history buff, San Francisco’s got a great story to tell, and I’m actually working on a brand new walking tour. And so, why don’t we try out this feature and help me out with that?
Kevin Weil: Let’s do it.
Justin Rushing: Great. So here I have a Notion document open on my
(11:54) John Nastos: computer. I always write my tours in Notion, and this is actually a real walking tour that I’m working on, so I, I hope you all find it interesting. Um, but I’m actually working on a new stop for my favorite character in San Francisco history, Emperor Norton. Um, I know some high-level talking points. He was the self-proclaimed emperor of the United States and protector of Mexico, who lived in San Francisco in the 1800s. And uh he even made his own currency that was actually valid in the city for a while.
John Nastos: That’s something you
(12:23) Justin Rushing: can just do?
Justin Rushing: Yeah, apparently apparently you can. And, uh, I think it’s going to make for a great tour stop, but I’m a little bit fuzzy on the details, and so I’m going to use Chat GPT to help me out. Uh, one option would be to copy and paste these bullets over, and I think chat would do a pretty good job at at going with that. But it would be helpful if it had context of the entire document, right? And so instead, I’m going to have Chat GPT work directly with Notion. So I’m going to hit option space to bring up Chat GPT, have it work with Notion, and
(12:51) John Nastos: I’m actually going to go ahead and just, um, highlight this stop here, so that the model knows what to pay attention to. And so now we can see we’re working with Notion in the walking tour document, focused on selected lines, and I’m just going to go ahead and say fill out these talking points. Right. Don’t need to be any more specific than that. Um, but one thing that is really important is that this is a walking tour, right? This is a history tour. Things need to be factually correct, and so to help with that, I’m going to push this button to turn on
(13:21) John Nastos: search. And now, to answer my question, Chat GPT is going to search the web, and everything it tells me is going to be grounded in citation, right? And so I want to find out more about, I can go ahead and click the links and you really start to see this awesome interaction loop pop up where Chat GPT is helping me with my research in the context of the document I’m writing. So awesome. This looks like all of the stuff that I’m hoping to cover. And so it doesn’t really sound like me though. This sounds like, you know, official results, and so I’m going to
(13:53) John Nastos: turn off search and just say uh make it match the style of the rest of the stops, keep it short, two paragraphs. And now, Chat GPT is going to go out and read the rest of my document, learn how, how I talk, and how I’ve written the rest of these, and do its best to imitate that. And so awesome. This, this looks great. Let me introduce you to one of San Francisco’s most beloved characters, you’ll have to come to the tour to find out the rest.
John Nastos: Sounds like you.
Justin Rushing: So I’m just going to go ahead and highlight this to copy and paste it back
(14:28) John Nastos: into Notion, and of course I’d want to iterate and refine from here. But that’s just a quick example of using ChatGPT to work with Notion.
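The internals of the desktop app aren’t public, but the pattern John demonstrates — give the model the full document for context while focusing it on the user’s highlighted selection — can be sketched as ordinary prompt assembly. All of the names and message fields below are illustrative assumptions, not OpenAI’s actual implementation:

```python
# Hypothetical sketch of the "work with an app" pattern from the demo:
# the model sees the whole document for context, but is told to focus
# on the selected lines. Names here are illustrative only.

def build_work_with_app_messages(app_name, document, selection, instruction):
    """Assemble chat messages pairing a full document with a highlighted selection."""
    system = (
        f"You are working with the app '{app_name}'. "
        "The full document is provided for context; focus your response on the "
        "lines marked SELECTED."
    )
    user = (
        f"FULL DOCUMENT:\n{document}\n\n"
        f"SELECTED:\n{selection}\n\n"
        f"INSTRUCTION: {instruction}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# Example mirroring the walking-tour demo above.
messages = build_work_with_app_messages(
    app_name="Notion",
    document="Stop 1: Ferry Building...\nStop 2: Chinatown...",
    selection="Stop 2: Chinatown...",
    instruction="Fill out these talking points.",
)
```

Keeping the selection separate from the full document is what lets a short instruction like “fill out these talking points” work: the model knows both where to write and what the rest of the tour sounds like.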
Justin Rushing: That’s awesome. I think it’s really compelling to work with your documents like this — not just code, like I showed before, but your written prose. This is excellent. But it’s just one way to work with the model, this sort of text-in, text-out method, and what I’d like to introduce today is support for using Advanced Voice Mode with this feature. We’re really excited about this.
(15:01) Justin Rushing: We think that this paradigm is a really interesting way to talk with ChatGPT and get insight into your documents, or your code for that matter. All right, so we’re going to show one of mine in a second here. When I’m not at OpenAI, I spend a lot of time as a professional saxophone player, and I’ve got this holiday party set list that I’d like some input on. I think ChatGPT would be great at this, but I want to use voice to do it. Is this why your holiday parties are all so good, by the way?
Kevin Weil: No, that’s a different thing. Okay, okay.
(15:30) Justin Rushing: Is it the suits? Yeah, the suits help for sure. All right, so I want someone special to really help out with this. So let’s bring up ChatGPT again with the shortcut mentioned earlier, Option+Space, and that brings up our chat bar. It’s automatically paired with Notes, but this time, instead of typing to the model, I’m going to use this new Advanced Voice Mode support. So I’ll use this icon in the bottom right to bring in our special guest.
John Nastos: Great.
Justin Rushing: Hey Santa, are you there?
Santa: Yes, I’m here, how can I help you
(16:13) Santa: today?
Justin Rushing: Santa, it is so great to talk to you. Listen, I’ve got this holiday set list that I’m working on, and there are a couple of songs that I’m just not sure about. This one, “I’ve Got My Love to Keep Me Warm,” I honestly don’t know very well. Can you recommend something else?
Santa: How about replacing “I’ve Got My Love to Keep Me Warm” with “Frosty the Snowman”?
Justin Rushing: Oh, that’s a good idea. Let’s see — “Frosty the Snowman.” Did I get that right?
Santa: Not quite.
Justin Rushing: Oh, okay. Keep going.
Santa: Sure thing, so is there anything else you’d like to know about your
(16:56) Santa: holiday set list?
Justin Rushing: I think Santa was bored with the copy editing, but we can move on. Let’s talk about the last song of the second set, “Auld Lang Syne.” I mean, that’s a little bit dreary. I’m hoping for something more energetic to really get you and the elves dancing. What do you think? What should I replace it with?
Santa: “Deck the Halls” would be a perfect replacement. It’s jolly, festive, and sure to get everyone dancing.
Justin Rushing: Santa, I think that’s a great idea. I think that song’s really going to slay. You know, interacting with
(17:32) Justin Rushing: the voice model like this is such a great way to talk to your documents and get live feedback on what you’re doing.
John Nastos: Totally. And good news — right before we stepped on this livestream, we pushed the button to ship all of the features we just showed off. So they should be available for you on your computer; just make sure you’re updated to the latest ChatGPT app for macOS.
Kevin Weil: Awesome.
Kevin Weil: So that’s day 11. It’s about using the desktop apps to see, understand, and automate the work that you’re doing with ChatGPT. This is
(18:04) Kevin Weil: available today — as of about 20 minutes ago — on our Mac desktop apps, and coming soon to Windows. I love that the desktop apps sit in the background while you’re doing work, and then with a keyboard shortcut they’re right there, able to help you with whatever you’re working on. So that’s day 11. We have one day left. We’ll be coming to you tomorrow morning for day 12. We’ve got something super exciting, so don’t miss it.
John Nastos: Yeah, we can’t wait to get these features out to you. We’re really excited. But in the
(18:32) John Nastos: meantime, I’ve got to start practicing this.
Justin Rushing: Yeah, that song Santa recommended.
Kevin Weil: All right, let’s see. [Music]