We are happy to announce the release of version 3.2.0 of the Elvis InDesign plug-in for Elvis Cloud!
Elvis Cloud, the digital asset management solution for creative teams, is now available with a connector for Adobe InDesign CC. It is already a great tool for managing and organizing your digital design work, as well as sharing files with a simple URL or via the built-in Brand Portal.
In the new Elvis InDesign plug-in, you can open layouts directly from your browser, as well as drag and drop images from Elvis straight onto your designs. Amongst other time-saving features are automated check-outs and centralized versioning.
Open an InDesign layout stored in Elvis Cloud directly from your preferred web browser, edit it in InDesign, and store the changes back to Elvis Cloud simply by saving and closing the layout (see below).
Placing a file directly from Elvis Cloud onto an InDesign layout. All selected files will be loaded into the InDesign Place Gun, from which you can place them in the regular manner.
Editing placed images. Images that are stored in Elvis can be easily opened for editing from within InDesign and stored back into Elvis.
Saving a placed image in a different file format. When editing an image it is possible to save it in a different file format, for example by turning a Photoshop file into a JPEG file or vice versa.
Elvis can automatically check out the layout you are working on, lock it, and create and save a new version.
Elvis Cloud can simply be switched on and is ready to use instantly. From the menu, the InDesign plug-in can be installed to connect your creative apps.
Elvis DAM has been built to make life easier for our customers. We built it with a REST API included - which enabled users to connect with other software solutions. But today, we’re excited to announce the launch of an additional, game-changing API.
Here’s a warm welcome to Webhooks API - and a brief introduction to the limitless opportunities it’s about to provide Elvis DAM users. Plus, at the end of this post, a recording of the webinar we did, showing Webhooks in action.
Webhooks vs. REST API
If you’re unfamiliar with the term, an API (Application Programming Interface) allows two digital tools to connect with each other. It essentially takes information from one software solution and passes it on to another.
In the simplest terms, one of the key benefits the REST API provides users is the ability to run searches across Elvis and deliver the results back to another software solution. Webhooks is set to change this process by sending realtime, triggered notifications based on events that happen within Elvis.
An example of how it works
Let’s say you want to connect a CRM solution to Elvis. You might decide you only want assets which are set to status ‘approved’ in Elvis to get pulled into the CRM.
Using the REST API, you would run a search to retrieve approved assets from Elvis. To ensure the assets were as up to date as possible, you might automate a schedule so that it would search every 5 minutes, for example.
This works well for delivering assets, but it can involve unnecessary processing (every scheduled search that runs without retrieving new assets is, in effect, wasted processing power).
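As a rough sketch, that scheduled search might look like the following (the endpoint path, query syntax and field names here are illustrative assumptions, not the documented Elvis API):

```python
import json
import urllib.parse
import urllib.request

# Assumed search endpoint; replace with your own Elvis server's URL.
ELVIS_SEARCH_URL = "https://example.com/services/search"

def fetch_approved_assets(since, opener=urllib.request.urlopen):
    """Poll for assets set to status 'approved' since the last run.

    The query syntax and metadata field names are illustrative
    assumptions; check your own Elvis configuration for the real ones.
    The opener is injectable so the sketch can be exercised without a
    live server.
    """
    query = "status:approved AND assetModified:[%s TO NOW]" % since
    url = ELVIS_SEARCH_URL + "?" + urllib.parse.urlencode({"q": query})
    with opener(url) as response:
        return json.load(response)["hits"]
```

A scheduler (cron, for example) would call this every few minutes and push any hits to the CRM - whether or not anything actually changed.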
With Webhooks, you could configure a notification that alerts the CRM the instant an asset is set to approved. This ensures processing power is only spent when and where it’s needed, and that the CRM will always contain the latest, most up-to-date assets.
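By contrast, a webhook receiver only does work when an event arrives. Here is a minimal sketch, assuming a JSON payload with type, assetId and metadata fields (the real Elvis event format may differ):

```python
import json

def handle_webhook(body: bytes) -> str:
    """Decide what to do with an incoming webhook event.

    Returns a short action string so the caller (for example, your CRM
    sync job) knows whether to pull the asset. The event fields used
    here are illustrative assumptions, not the documented Elvis schema.
    """
    event = json.loads(body)
    status = event.get("metadata", {}).get("status")
    if event.get("type") == "asset_update_metadata" and status == "approved":
        # Only approved assets trigger a sync; everything else is a no-op.
        return "sync-to-crm:" + event.get("assetId", "")
    return "ignore"
```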
So which is better?
Well actually, this is much less of a clash of the titans and more of a combined solution that opens up a whole new world of opportunities for integrating Elvis with other systems. Webhooks and the REST API each solve a different part of the integration puzzle. The Webhooks API notifies users based on realtime events, whereas the REST API allows you to interact with or access data in Elvis.
3 Key Webhooks benefits
Realtime notifications either for users or to automate system synchronization
Preventing unnecessary processing and helping to keep systems running optimally
Alerting users about asset removal - typically difficult for a REST API to identify
Potential use-cases for Webhooks
User notifications - Webhooks can alert users when files are added to collections, downloaded, or when sharelinks are created, for example. You can connect Webhooks to generate notifications through any channel, such as email or chat messages in Slack.
External information retrieval - you can retrieve information in realtime to enrich assets in Elvis DAM with richer data. This works for things like image-recognition AI software enriching asset metadata, or automating the translation of metadata within Elvis.
Synchronizing two systems - if you’re aiming to keep another system like your CRM or PIM synchronized with Elvis, Webhooks ensures it happens seamlessly. This is especially useful for things like deleted files which aren’t usually detected through the REST API.
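One practical note when wiring up any webhook receiver: verify that a notification really came from the sender before acting on it. A common pattern is an HMAC-SHA256 signature over the request body using a shared secret; the sketch below assumes that generic scheme rather than documented Elvis-specific behaviour:

```python
import hashlib
import hmac

def signature_is_valid(secret: bytes, body: bytes, received_signature: str) -> bool:
    """Recompute the HMAC-SHA256 of the raw request body and compare it,
    in constant time, against the signature sent with the webhook
    (typically in an HTTP header - the header name varies by vendor)."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_signature)
```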
Want to learn more about the opportunities Webhooks can bring?
Take a look at this webinar recording to see how easy it is to add a Webhook to Elvis DAM. The sample integrations shown will give you a good idea of the possibilities Webhooks and the REST API together can bring.
For more technical information about the opportunities Webhooks can provide, check out our Help Center.
Here at WoodWing we pride ourselves on our unique ability to create highly efficient solutions for all types of digital and print publications, and we do this by working very closely with our customers, partners and various experts that specialize in industry market research and trends.
During our latest webinar, media industry expert Andreas Pfeiffer presented his insights on trends and challenges for magazine publishers, including having to adapt to new production methods. He mentions how, even though the print publication still dominates, all of the other channels have become essential too.
His insights touch on the problem areas experienced by magazine publishers including duplication of effort among teams, not repurposing content effectively, and the need for publications to future-proof their systems.
The benefits of having a managed workflow system are also highlighted: increased efficiency, easy reuse of assets, optimized collaboration and automatically generated digital outputs.
Watch the Webinar on how to speed up your publishing process.
Following on from Andreas’ presentation, a sneak peek into the next generation of WoodWing’s Enterprise was shown.
The upcoming version has taken into account the main problems reported by publishers:
The content they currently create is not optimized for repurposing on any channel, and
Their current workflow does not allow for easy reuse of content
The demonstration shows an example of a digital story and how content can easily be added to its various components. These components are what give a story its structure, and without even realizing it, the editor has created a story that can easily be reused.
The instant article feature displays how the story is going to look on a range of devices (including iPhone and iPad) and the focus point feature demonstrates how the image can be amended to ensure the story looks attractive on any digital device.
It is explained that the story can be published to various channels, including out-of-the-box integrations such as Adobe AEM, Facebook Instant Articles and Apple News. Supported third-party plugins include Pugpig, Twixl and Purple from Sprylab; however, custom channels can also be integrated via the publish API.
The demonstration of the print layout shows how the placed objects are in different stages of the workflow, indicated by their color highlights. The ability to easily edit and tweak the story through Enterprise’s Content Station, Adobe InCopy or InDesign is also explained.
A new workflow approach
Typically, in the past, print magazine publishers have created the print article first and then had to manually apply structure to be able to reuse it for digital channels - a long and tedious exercise.
In the demonstration it can be seen how content is applied to a story in a structured way, so that reuse on any channel is possible, with simple tweaks. The workflow approach becomes channel-neutral, where variants of the story are adapted to suit the publishing channel.
Join industry expert Andreas Pfeiffer and WoodWing for an exclusive, free webinar.
Attention all magazine media publishers and creators of amazing content: here is a webinar designed especially for you. WoodWing has teamed up with renowned industry expert Andreas Pfeiffer from Pfeiffer Consulting for a 30-minute power webinar.
The session will provide participants with deeper insight into (including but not limited to):
Adapting magazine media workflows to include digital
Streamlining the revision and publication process
Using web-based solutions to speed up team collaboration
WoodWing are specialists in creating the most efficient multi-platform publishing solutions for top magazine media companies worldwide. Following Andreas’ presentation, the company will provide a brief, exclusive sneak peek into how the next generation of their Enterprise solution can be used to create digital and print content simultaneously.
Join Andreas and WoodWing for this once-off, exclusive and free webinar.
With a background in providing market-specific research and consulting to media producers and technology providers, Andreas has experience in driving magazine publications to success. His work has been highlighted in newspapers and magazines such as the New York Times, the International Herald Tribune and Wired.
We’ve created a two-part blog series about Artificial Intelligence & Digital Asset Management after testing the most advanced tools available. This post looks at the key features the major image-recognition tools provide and shows how they can be applied within Elvis. If you are not familiar with Elvis DAM yet, please learn more about how it can enhance your file management here: “What Is Elvis DAM.”
We’ve been testing the functionality of the three key players in image-recognition AI - Google Vision, Amazon Rekognition and Clarifai.
Each tool has its own merits and provides different sources of data that can enhance the usability of your DAM. AI and Elvis were practically built for each other - not only were these integrations simple, they’ve also been incredibly effective very quickly. See it for yourself in the demo below.
The key features
This is the core functionality that image-recognition AI provides DAM users. As you see in the video, we upload files within Elvis and the metadata returned by the API is automatically added to the assets within Elvis.
At this point, it’s always important to validate the metadata - and build on it with additional information your users might need. From here, users can either search or filter by the tags you approve - providing multiple options to find exactly what they’re looking for. This could save thousands of hours (and a lot of frustration) for DAM administrators everywhere!
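That validate-then-approve flow can be sketched as a simple merge, keeping AI suggestions out of the searchable tags until an administrator signs off (the field names here are illustrative assumptions, not Elvis metadata fields):

```python
def merge_ai_tags(asset, suggested_tags, approved_by_admin=False):
    """Attach AI-suggested tags to an asset's metadata.

    Suggestions land in 'aiTagsPending' until validated; once approved,
    they are merged into the searchable 'tags' field without duplicates.
    """
    metadata = dict(asset)  # leave the original asset record untouched
    if approved_by_admin:
        merged = sorted(set(metadata.get("tags", [])) | set(suggested_tags))
        metadata["tags"] = merged
        metadata["aiTagsPending"] = []
    else:
        metadata["aiTagsPending"] = sorted(set(suggested_tags))
    return metadata
```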
Clarifai has consistently delivered the richest, most varied metadata throughout testing, but the other tools are catching up quickly.
In our demo we’ve created specific fields for the different tools so you can see what each provides. You can already see that there is a big difference in the type of metadata that is retrieved, so when evaluating the different tools, it’s important to relate this back to your requirements checklist.
Imagine you’ve come across the perfect image online and need to find something similar in your own database. Simply drag and drop that image into the plugin search function and Elvis will automatically deliver a bank of images with the same tags. The image you searched with will not even be added to Elvis, meaning you keep a clean database of your own assets.
Here, we can filter by characteristics like mustaches to find the exact image we need.
If your image library features a lot of people, being able to break it down by searching for things like emotion, or personal characteristics can save significant amounts of time.
Amazon shines in this area, and with a simple integration within Elvis, it’s easy to keep this metadata as a separate field. This makes it very easy to refine searches by filtering, as you might with taxonomies.
Today’s AI software is powerful enough to identify major landmarks - and Google Vision will even append this with their exact co-ordinates. The benefits of this are twofold:
1. Users can search for the landmark by name without administrators having to manually add it
2. The asset can automatically be added to the Maps integration that already exists as standard within Elvis’ sample plugins.
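Flattening a landmark result into metadata fields might look like this sketch (the response shape loosely mirrors a vision API's landmark annotation, but treat the exact keys as assumptions):

```python
def landmark_to_metadata(annotation):
    """Flatten a landmark annotation into flat metadata fields that a
    DAM (or a Maps integration) can index: name plus latitude and
    longitude. Missing fields simply come back empty or None."""
    location = annotation.get("locations", [{}])[0].get("latLng", {})
    return {
        "landmark": annotation.get("description", ""),
        "latitude": location.get("latitude"),
        "longitude": location.get("longitude"),
    }
```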
In a similar way to the image-search feature, if you’ve already found a great image within Elvis but need more options, you can install a simple plugin that lets you right-click and search for similar images. With the extensive tags provided by AI, this helps ensure users return the greatest volume of relevant results.
Image recognition and Elvis
These are just some of our favourite ways that AI may enhance DAM usability, but there will be plenty more to come as the technology evolves. Elvis is evolving just as quickly, and these integrations will be publicly available very soon.
Keep up to date with the latest WoodWing news to make sure you don’t miss any announcements, or contact us with any specific questions you may have.
We’ve created a two-part blog series about Artificial Intelligence & Digital Asset Management after testing the most advanced tools available today. Part 1 provides an overview of AI technology and the 3 things all new DAM users need to know before getting started.
The past few years have been incredibly exciting for AI. Tech giants like Google, Microsoft, Amazon and IBM have been investing heavily in the technology and it’s evolving rapidly (that’s one of the key benefits of machine-learning after all!).
Today’s AI has been used to enhance search engines, self-driving cars, personal phone assistants - and yes, now DAM usability. But in order to make it one of the most time-saving additions to your DAM toolbox, it’s important to first understand what you can get out of it and the limitations of the technology.
That’s why we’d like to share 3 key lessons we’ve discovered whilst testing the most advanced image recognition tools available to DAM vendors today.
AI Lesson #1 - the technology is narrow, but it’s about to expand your metadata.
It may sound futuristic, but current AI technology is known to experts as ‘narrow AI’. Simply put, it means that it is designed only to perform limited tasks. Take for example a search engine algorithm or a self-driving car - the technology is used to achieve one clear task.
Facial analysis from Amazon Rekognition, one of the most advanced tools on the market (image courtesy of AWS).
When it comes to DAM, this task revolves around Image Recognition Technology. The technology can identify a host of criteria within every image uploaded - from landmarks to emotions on individual faces.
These criteria are ranked on a scale of accuracy, as you see in the image above. Users are able to set the parameters - so if you want a lower volume of highly accurate characteristics, you might set an accuracy level of 90% or above.
The identified criteria can then automatically be added to an image’s metadata. This instantly builds a much larger and richer range of terms to help users search out the perfect asset.
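Applying that accuracy threshold before writing metadata is typically a small filter over the returned labels (the name/score structure here is an assumption representative of these APIs, not any one vendor's exact response):

```python
def filter_labels(labels, min_score=0.9):
    """Keep only labels whose confidence score meets the threshold,
    so fewer but more reliable terms end up in the metadata."""
    return [label["name"] for label in labels if label["score"] >= min_score]
```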
AI Lesson #2 - don’t trust robots blindly.
As sophisticated as image-recognition might appear or sound, it’s still in its infancy. It hasn’t yet matured (emotionally or analytically).
For example, we’ve observed that current software easily identifies positive emotions but struggles to detect negative ones. Beyond that, analysis can sometimes miss the mark completely - like identifying a man as a woman, or a lizard as a tree.
Fortunately machine-learning evolves very quickly. The more images analysed, the more accurate and robust the results become. And as the big image-recognition providers analyse billions of images every day, they’re catching up very rapidly.
So whilst the technology may lighten the workload, the role of DAM administrators is as pivotal as ever to validate and enhance the metadata provided by the technology. And regardless of how sensitive the robots become, the relevance of attached metadata will always be defined by your end-user needs.
AI Lesson #3 - define your AI requirements before selecting a vendor.
When looking to acquire a DAM, the first part of the process is always to list the specific requirements and functionality you need. Why? Because different product features help you to meet those exact needs.
It helps you to evaluate what will be most useful to you, without getting swept away in a dazzling sales presentation.
The same is true with AI vendors.
We’ve been testing the functionality of the three key players in image-recognition AI - Google Vision, Amazon Rekognition and Clarifai. As you’d expect, they provide a lot of data that can be really useful to DAM users. But each has strengths in different areas.
Making sure you can make the most of this technology means considering how it can be used, practically, by the people who use your DAM every day. This way you can avoid getting swept up in the excitement of what the technology can do, rather than what will make it useful for you.
The best way to achieve this is to speak to your core-users. Identify where the pain-points are when searching and which searches they feel they’re losing most time on. From here, build a list of key criteria for selection:
Is our image collection varied enough to put this tool to use?
What are the main searches being conducted?
Do we have different categories of images (people, places, buildings, objects)?
What kind of searches would make people’s lives easier (location-based, sentiment)?
When you have a clear picture of what you’re looking for, it’s time to evaluate the tools.
Speaking of which, check out Blog #2 to learn more about what they offer - and how they can enhance the usability of your DAM.
Content Station 10 (CS10) continues to meet publishers’ needs with new features that include opening articles in Adobe InCopy, adding notes to a dossier and an option to copy files.
Opening articles in InCopy
Increase productivity by configuring CS10 to open articles in Adobe InCopy. Previously, articles would automatically open in the Print Editor of CS10; it is now possible to add a step to your workflow that opens articles in InCopy and lets you edit them there.
A dossier is the main content storage tool in CS10 for editorial teams. Updates to the dossier feature allow users to add notes and messages via a chat box: clearly indicate what the story is about, what the dossier should be used for, or what content it should contain.
Thumbnails now include text
When viewing files in thumbnail mode, a text snippet now shows the first 250 characters. Providing visible information for the user is just another way CS10 reduces production time for fast publication.
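Generating such a snippet is straightforward - roughly, the first 250 characters of the article text:

```python
def thumbnail_snippet(text, limit=250):
    """Return the text shown under a thumbnail: the first `limit`
    characters, with an ellipsis when the article is longer."""
    text = " ".join(text.split())  # collapse line breaks and extra spaces
    return text if len(text) <= limit else text[:limit].rstrip() + "…"
```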
Shortcut to Copy Files
Lastly, a simple but very useful update is the right-click shortcut for copying one or more files to another location.
This year CS10 users will receive access to the Content Station Digital Editor. Storytellers will be able to easily access consumers on multiple digital channels, and satisfy the demanding needs of today’s audience.
Content Station 10 (the most advanced version of Content Station yet) is built entirely in HTML and replaces the Adobe Flex based version. It has been optimized to provide an enhanced experience for content creators.
For more information regarding the development of the Content Station Digital Editor or article creation using Content Station Print Editor please get in touch.
In the real world, we’re taught that you never get more for less. A product’s value is calculated as a simple equation of quality over cost. But the digital world is infinitely more complex.
Whether it’s because the internet was created as a free-exchange of information, or perhaps because digital products are seen to be so easily updated (after all, it’s just another line of code, right?), our demands are higher than ever.
Add to that the ease and speed with which competitors are setting up shop, and software vendors are constantly pushed to create more for less.
But I want to stress, this is not a complaint. This is one of the most exciting challenges software developers must acknowledge and use to strengthen their product offering.
Here’s a look at how these contrasting demands are shaping the DAM landscape in 2017.
Flexibility and freedom, with added security
The availability of cloud products in our everyday lives means we’re all getting used to having access to the things we need, whenever and wherever we are. This demand for on-the-go access is no different for DAM users.
But cloud convenience comes with a risky reputation. For the average person, the convenience of cloud services generally outweighs the security risks. But when it comes to businesses, leaving their assets vulnerable is simply out of the question.
It’s important to note that cloud solutions are backed by an enormous amount of data security. For example, Elvis Cloud - WoodWing’s standalone cloud product - relies on Amazon Web Services technology. This alone uses over 1,800 security measures.
Private cloud solutions offer freedom and flexibility with added security.
But as business data policies get stricter, businesses demand specific, configurable security measures. This means they want a greater say in the cloud service provider they use.
This is why we’ve seen a rise of configured, hybrid solutions that we like to call Private Clouds. For our customers, that means hosting the full Elvis environment with a chosen cloud provider, in the country of their choice.
This ensures the flexibility and scalability of a cloud product, with a provider that can be configured to specific business requirements.
Less labor, more metadata
The more care and attention you pay to the metadata of your assets, the easier it is for your users to find what they’re looking for. But as your collections grow from thousands and into millions of assets, simple searches deliver large volumes of results for users to filter through.
To make life easier, users will search by specific, narrow keywords to reduce the results volume. But for your DAM to support these terms you need a lot of metadata to ensure they arrive in the results. This can be a long and very subjective task for DAM administrators or librarians.
There are a variety of ways to resolve this challenge. Taxonomies provide a structured way to add metadata to assets. But when it comes to tagging emotions or other highly relevant but subjective fields, it has typically required a manual approach and a lot of time.
Is the girl anxious, or disappointed? Deciding how to tag this image would have a huge impact on whether it would be found and used in the future.
Many of the major search engines are using AI to automate categorization of images. There are a few key tools on the market that can analyze an image and identify a mountain from a desert, or a happy face from a sad one.
These tools signal a more efficient way for administrators to handle batch-tagging of assets with relevant metadata. For users it means finding exactly what they’re looking for, in less time.
But despite the industry hype, AI is far from a perfect solution at present - Google’s “gorilla” AI error is testament to that. It’s a useful addition to the DAM toolbox to enhance and speed-up manual metadata processes, but let’s not pretend we’ve lost the need for a human perspective just yet.
We’ve been working on ways to enhance the current AI solutions, ensuring you gain accurate and useful metadata whilst minimizing the time spent manually adding it. You’ll see some really cool stuff in the coming weeks, so watch this space.
Greater use, less visibility
Ever felt a bit overwhelmed by the sheer number of tabs, apps and programmes you are running? These tools can be incredibly helpful when it comes to achieving specific tasks. But managing so many processes at once can actually increase distractions, making us less productive and reducing our attention spans.
Across a business, a DAM’s full functionality is typically used by just a few departments, such as Marketing. But its benefits are multiplied when every department uses the same central database of approved assets. In cases like this, the more you can reap your DAM rewards without pushing users out of other tools, the more efficient it can be.
Keeping people focused on one task or tool helps minimize distractions and boost productivity
Reducing time users spend jumping between tools helps them focus on their tasks and means they’re more likely to search for the latest assets, rather than relying on old ones they have to hand. This is integral to getting the most value out of your assets.
WoodWing’s roadmap promises huge leaps and innovation in supporting DAM functionality within other tools. Take for example our Salesforce integration - enabling teams to share assets with customers or internally without ever leaving the platform.
Manual decision making can be quite a slow process, based on evaluation of different scenarios and outcomes.
But in an era where we’re practically swimming in available data, we increasingly rely on numbers to help us speed up these decisions and work more quickly.
Data overload does not make for effective decision-making
When it comes to your assets, knowing how they’re being used and how effective they are helps you make smarter decisions. But as the number of tools we use grows, so too does the volume of available data. That’s why simple, visual analytics have become critical for effective decision making.
Take, for example, WoodWing Enterprise users who also use Elvis DAM. The analytics within the integration not only use simple visuals to show how often an asset has been used; they also detail where it was used, how it was cropped, and the publication title.
This is essential for knowing at a glance if an asset is available for reuse in other publications or markets, or whether an alternative would be better.
This exciting, evolving landscape underpins Elvis DAM’s roadmap of features and integrations for the coming year. Keep an eye out for developments that support these themes - we’re excited to share them with you as soon as possible.
All new for Digital Editors using Inception, you can now add components and text to your articles directly from your iPad.
In the past, this was only available through Inception on your computer; however, it is now possible to add, edit and delete components while on your daily commute or away from your laptop. Features such as undo and redo have also been incorporated.
The (green) add component button is displayed on the left-hand side of the Inception Digital Editor for ease of use, and so as not to interfere with writing the story.
Inception is the content creation tool that allows users to generate responsive articles and customize them to their individual style or brand. Publish articles to your app, website and social media channels without having to manage separate online tools and apps.
Engagement. It’s the name of the game for Keith Barraclough at Texture.
And how does he do it? By reaching users on a number of platforms, grabbing their attention and driving engagement to the Texture app.
Texture started in 2010, and has been described as the Netflix of the magazine world. Their app has over 200 of the top magazine titles in North America and allows readers unlimited access to the expansive catalog of magazines for one monthly price.
In short, they bring the world’s greatest magazines to life on the digital device of your choice. And in this role they have been highly successful.
They publish at various levels including article level, collection level and full issue level, engaging readers through customization and providing choice throughout the digital experience.
Using WoodWing Inception technology to ensure articles are responsive and available on all devices, Keith does not underestimate the value of automation and workflow technology.
“Majority of our publishing partners have some elements of the WoodWing ecosystem in their publishing”, he explains.
WoodWing Inception technology allows editors to efficiently reuse, republish and repurpose articles, as well as create responsive content with a workflow system.