{"id":95488,"date":"2023-07-27T15:07:12","date_gmt":"2023-07-27T13:07:12","guid":{"rendered":"https:\/\/www.hiig.de\/?p=95488"},"modified":"2024-01-31T15:35:42","modified_gmt":"2024-01-31T14:35:42","slug":"inside-hugging-face","status":"publish","type":"post","link":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/","title":{"rendered":"Inside Hugging Face"},"content":{"rendered":"\n<p><strong>To understand the dynamics of current open-source machine learning research, one platform is of central importance: Hugging Face. For accessing state-of-the-art models, Hugging Face is the place to be. In this post, I analyse how models on this platform changed during the last years and what organisations are active on Hugging Face.<\/strong><\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Hugging Face, not OpenAI<\/strong><\/h2>\n\n\n\n<p>While much of recent reporting on AI is focused on OpenAI, the company behind ChatGPT, another company that is much more relevant to the everyday life of a Machine Learning (ML) developer, receives much fewer attention:<a href=\"https:\/\/huggingface.co\/\" target=\"_blank\" rel=\"noreferrer noopener\"> Hugging Face<\/a>. Hugging Face is a platform that hosts ML models including Large Language Models (LLMs). Literally everyone can upload and download their models on this platform and it is one of the main drivers for democratising research in ML. Therefore, understanding what actors and models are on Hugging Face, tells us much about current open-source research in ML.&nbsp;<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>The company behind the emoji<\/strong><\/h2>\n\n\n\n<p>Hugging Face is a young company that was founded in 2017 in New York City. Its original business model was providing a chatbot. However, nowadays it is mostly known as a platform that hosts ML models. At first only language models but in the recent months vision, speech and other modalities were added. 
Besides free hosting, it also provides some paid services, such as a hub to deploy models. Hugging Face is currently<a href=\"https:\/\/www.forbes.com\/sites\/kenrickcai\/2022\/05\/09\/the-2-billion-emoji-hugging-face-wants-to-be-launchpad-for-a-machine-learning-revolution\/\" target=\"_blank\" rel=\"noreferrer noopener\"> valued at two billion dollars<\/a> and recently partnered with<a href=\"https:\/\/www.reuters.com\/technology\/amazon-web-services-pairs-with-hugging-face-target-ai-developers-2023-02-21\/\" target=\"_blank\" rel=\"noreferrer noopener\"> Amazon Web Services<\/a>.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Models are becoming larger<\/strong><\/h2>\n\n\n\n<p>Contributions to Hugging Face are constantly increasing, and repository sizes are growing over time (Fig 1). This means that increasingly large models are being uploaded, which mirrors recent developments in AI research. Many of the models are provided by companies, universities, or non-profits, but most come from entities that provide no information (\u201cOther\u201d). Manual inspection shows that this group most often comprises individuals who uploaded one or several models, but also organisations that simply did not add their information.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh6.googleusercontent.com\/Qj4u29Xfe33orZMpnnBDNQjBJIAOAysJ7Y5j0QZhnpnJpikVf2SVlVtoOsPg_hYvUXBk5RPZBo2u442aCHVbooEyMPpdfeL0lz--mQYiOFkkW-XQUgZKtQRl6X99GPKAms6A9vmgPT13xL29VKGHhME\" alt=\"\"\/><figcaption class=\"wp-element-caption\"><em>Fig 1 Uploaded models over time and their repository sizes in Gigabyte (n = 151,296). Y-axis in log scale.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>What types of organisations are active on Hugging Face?<\/strong><\/h2>\n\n\n\n<p>Who uploads the largest models, and whose models are most wanted? The biggest group in terms of uploaded models is \u201cOther\u201d (Fig 2). 
The remaining groups, except for \u201cclassroom\u201d, are fairly close to each other. This means that universities, non-profits, and companies contribute about equally to Hugging Face.<\/p>\n\n\n\n<p>The recent surge in ML research is mostly driven by Big Tech and elite universities, because training large models is extremely resource-intensive in terms of compute and money. Accordingly, one would expect that most large models come from companies or universities and that their models are downloaded the most. But this is only partly the case. Models by companies and universities are indeed most wanted. However, while models uploaded by companies are also relatively large, models by universities are not. The largest models are uploaded by non-profits; however, this is mostly due to a few models by <a href=\"https:\/\/laion.ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">LAION<\/a> and <a href=\"https:\/\/bigscience.huggingface.co\/\" target=\"_blank\" rel=\"noreferrer noopener\">BigScience<\/a>, each of which is more than a hundred gigabytes.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh6.googleusercontent.com\/55UbQPL2e7urb0IruvubQuz1KDv10Z_ZjZkgbmHSCoI0_ZC1BYWKm7lsws6qPSauLqdpAkSz3vz76rh7HMsNTUn4Cw0ijreBoDqbAH5w0lNMA8EboCpGSPI5_tFHeYM6j67UlK9ATDUKnW8YlwCK3Rw\" alt=\"\"\/><figcaption class=\"wp-element-caption\"><em>Fig 2 Uploaded models, repository size, and count of downloads by (self-assigned) organisation type. X-axis log scale.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Upload and download are not proportional<\/strong><\/h2>\n\n\n\n<p>Which individual organisations contribute the most on Hugging Face? Among universities, the <a href=\"https:\/\/www.helsinki.fi\/en\" target=\"_blank\" rel=\"noreferrer noopener\">University of Helsinki<\/a>, which provides translation models for various language combinations, leads by far (Fig 3). 
However, when it comes to downloads, the Japanese <a href=\"https:\/\/www.naist.jp\/en\/\" target=\"_blank\" rel=\"noreferrer noopener\">Nara Institute of Science and Technology (NAIST)<\/a> leads. This is mostly due to <a href=\"https:\/\/huggingface.co\/sociocom\/MedNER-CR-JA\" target=\"_blank\" rel=\"noreferrer noopener\">MedNER<\/a>, their named entity recognition model for medical documents, which is also among the top 10 most downloaded models on Hugging Face. German universities are among the organisations with the highest model counts as well as the most downloads.<\/p>\n\n\n\n<p>The non-profit repositories with the most models should not be given too much weight, since three of them are maintained by a single PhD student. However, the organisations in this category with the highest download counts are well-known non-profits like LAION and BigCode, which contribute a lot to the open-source community.<\/p>\n\n\n\n<p>Unsurprisingly, Big Tech companies lead the industry category in upload counts. However, it is neither Google nor Facebook that has the highest download count, but <a href=\"https:\/\/huggingface.co\/runwayml\/stable-diffusion-v1-5\" target=\"_blank\" rel=\"noreferrer noopener\">Runway with its Stable Diffusion model<\/a>.<\/p>\n\n\n\n<p>The top 10 most downloaded models are led by <a href=\"https:\/\/huggingface.co\/jonatasgrosman\/wav2vec2-large-xlsr-53-english\" target=\"_blank\" rel=\"noreferrer noopener\">Wav2Vec2<\/a>, a speech recognition model. 
Surprisingly, even though Hugging Face was long known as a platform for unimodal language models, many of the most downloaded models are bimodal, combining text with either vision or speech.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh4.googleusercontent.com\/rKkGxhGpVggHQrfslVjB34RuUuQtc8-txn4fk6FCnbRmgnP3MUA2KTLFnoLxhKmEqd6vg-3YhDx49xElngtqXAEf62mn73r7P4Ig_BJfn4cmIYeq5SceiKZbwjt_WQN7S-NVaBBZ_Twt_GwioZRokz4\" alt=\"\"\/><figcaption class=\"wp-element-caption\"><em>Fig 3 Organisations with the highest upload\/download count.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Open-source LLMs are on the rise<\/strong><\/h2>\n\n\n\n<p>Since the launch of ChatGPT at the latest, LLMs have entered public attention. However, in line with a recently<a href=\"https:\/\/www.semianalysis.com\/p\/google-we-have-no-moat-and-neither\" target=\"_blank\" rel=\"noreferrer noopener\"> leaked memo<\/a> by a Google developer, open-source models are becoming serious competitors among LLMs. Hugging Face started a<a href=\"https:\/\/huggingface.co\/spaces\/HuggingFaceH4\/open_llm_leaderboard\" target=\"_blank\" rel=\"noreferrer noopener\"> leaderboard<\/a> for open LLMs, and some of them reach performance close to that of ChatGPT.<\/p>\n\n\n\n<p>Model size, measured in parameters, has increased drastically in recent years (Fig 4). While in 2021 only a few models reached 3 billion parameters, there are now many more and much larger open models, reaching almost 70 billion parameters.<\/p>\n\n\n\n<p class=\"has-text-align-left\">The organisation type that uploads the most LLMs is \u201cOther\u201d. However, this is mostly driven by a few actors, some of whom uploaded more than 15 models each. The group that uploads the second most LLMs is companies, whose LLMs average 10 billion parameters. 
Universities, on the other hand, not only upload the fewest LLMs but also the smallest ones on average.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"1200\" height=\"720\" src=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-1200x720.png\" alt=\"\" class=\"wp-image-95491\" srcset=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-1200x720.png 1200w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-800x480.png 800w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-60x36.png 60w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-768x461.png 768w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-180x108.png 180w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-50x30.png 50w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-550x330.png 550w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-600x360.png 600w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-1536x922.png 1536w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4-1320x792.png 1320w, https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Fig4.png 2000w\" sizes=\"auto, (max-width: 1200px) 100vw, 1200px\" \/><figcaption class=\"wp-element-caption\"><em>Fig 4 Parameter count of Large Language Models (LLMs) over time and by organisation type (retrieved from Hugging Face\u2019s LLM Leaderboard).<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Large models, large emissions. Small models, small emissions<\/strong><\/h2>\n\n\n\n<p>Good news first: training most of the models emitted less CO<sub>2<\/sub> than streaming one hour of video in 4K quality. The bad news is that only slightly more than 1% of the models on Hugging Face provide this information. Of the 1,846 models that include emission information, 1,706 are from \u201cOther\u201d. However, 99.5% of the emissions come from non-profit models. 
This means that the training of only a few models accounts for the biggest share of emissions. This emphasises that the <a href=\"https:\/\/www.hiig.de\/en\/sustainable-ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">major part of emissions comes from large projects and not from smaller ones<\/a>. But it also shows that more transparency is needed and that emissions must be documented more consistently.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/lh6.googleusercontent.com\/H87bKEKr3kXMY-Zj9sZt7lnWEvsc1nVp9K713BmQCHo7oxUvjACGoJbNo2Qno8vON4_Wwq4it-RwkiwQMYYiS16_kYOErix0dJjt16Ugh7d7uuUZ04LzDTDcXlxUEZnqncmxYpM07Hb60GiKws5nEag\" alt=\"\"\/><figcaption class=\"wp-element-caption\"><em>Fig 5 Training emissions (CO<sub>2<\/sub>\/kg) for each model and by organisation type. Selection is restricted to repositories that include their emissions in their model card (n = 1846). Pie chart displays the share of repositories that included this information by organisation type. X-axis log scale.<\/em><\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Summing up<\/strong><\/h2>\n\n\n\n<p>Hugging Face is the main platform for distributing pre-trained ML models. Without a platform like this, most smaller AI projects would not have access to state-of-the-art models. In a way, Hugging Face reflects the current trend in ML research: the field is dominated by a few actors who have the resources to train increasingly large models. However, even though training large models is restricted to actors with sufficient resources, Hugging Face enables projects with fewer resources to make use of these models. Moreover, even though large companies draw most of the attention, there is a vibrant base of small and mid-sized projects producing steady output and only small emissions.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Method &amp; limitations<\/strong><\/h2>\n\n\n\n<p>All data was scraped at the beginning of July 2023. 
As progress in ML research is fast, the numbers may already be somewhat outdated. For information on the models, Hugging Face\u2019s own API was used. Information on the organisations was retrieved with a custom scraper. Users on Hugging Face have the option to upload information sheets, or \u201ccards\u201d as they call them. These cards provide information on organisations, datasets, or models, and they are the main source informing this article. All information on organisations is self-assigned, and there are many empty repositories and fake organisations. For example, while Hugging Face offers more than 260,000 models according to its search engine, only about 150,000 of these repositories are larger than ten megabytes. Since ML models require a lot of memory, it is unlikely that the smaller repositories actually contain models. I believe that the general trend is correct, but some details might be inaccurate. The detailed information on individual organisations was retrieved manually and is accurate to the best of my knowledge.<\/p>\n\n\n\n<p><em>All data can be viewed via <a href=\"https:\/\/www.google.com\/url?q=https:\/\/github.com\/SamiNenno\/Inside-Huggingface&amp;sa=D&amp;source=docs&amp;ust=1690464409020082&amp;usg=AOvVaw2xh-tC3KX4vfqSqp1b96TA\" target=\"_blank\" rel=\"noreferrer noopener\">https:\/\/github.com\/SamiNenno\/Inside-Huggingface<\/a><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Understanding what actors and organisations are on Hugging Face is crucial for understanding the current dynamics of open-source research in machine learning.\u00a0<\/p>\n","protected":false},"author":9999998,"featured_media":95489,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1289,1577,1582],"tags":[1612,1240,1613,861],"class_list":["post-95488","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-artificial-intelligence","category-digital-so","category-ftif-ai-and-society","tag-large-language-models","tag-machine-learning-en","tag-organisationen","tag-sustainability-2"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Inside Hugging Face &#8211; Digital Society Blog<\/title>\n<meta name=\"description\" content=\"Hugging Face is a platform that hosts machine learning models including Large Language Models (LLMs). 
What actors and organisations use it?\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.hiig.de\/en\/inside-hugging-face\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Inside Hugging Face &#8211; Digital Society Blog\" \/>\n<meta property=\"og:description\" content=\"Hugging Face is a platform that hosts machine learning models including Large Language Models (LLMs). What actors and organisations use it?\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.hiig.de\/en\/inside-hugging-face\/\" \/>\n<meta property=\"og:site_name\" content=\"HIIG\" \/>\n<meta property=\"article:published_time\" content=\"2023-07-27T13:07:12+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2024-01-31T14:35:42+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1920\" \/>\n\t<meta property=\"og:image:height\" content=\"1080\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Stefanie Barth\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Stefanie Barth\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"9 minutes\" \/>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Inside Hugging Face &#8211; Digital Society Blog","description":"Hugging Face is a platform that hosts machine learning models including Large Language Models (LLMs). 
What actors and organisations use it?","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/","og_locale":"en_US","og_type":"article","og_title":"Inside Hugging Face &#8211; Digital Society Blog","og_description":"Hugging Face is a platform that hosts machine learning models including Large Language Models (LLMs). What actors and organisations use it?","og_url":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/","og_site_name":"HIIG","article_published_time":"2023-07-27T13:07:12+00:00","article_modified_time":"2024-01-31T14:35:42+00:00","og_image":[{"width":1920,"height":1080,"url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png","type":"image\/png"}],"author":"Stefanie Barth","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Stefanie Barth","Est. reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#article","isPartOf":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/"},"author":{"name":"Stefanie Barth","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a07aa81c80d1dbd4ef1ab5c1cd9c10fd"},"headline":"Inside Hugging Face","datePublished":"2023-07-27T13:07:12+00:00","dateModified":"2024-01-31T14:35:42+00:00","mainEntityOfPage":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/"},"wordCount":1378,"publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"image":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png","keywords":["large language models","Machine Learning","organisationen","sustainability"],"articleSection":["Artificial Intelligence","Digital Society Blog","ftif AI and 
Society"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/","url":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/","name":"Inside Hugging Face &#8211; Digital Society Blog","isPartOf":{"@id":"https:\/\/www.hiig.de\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#primaryimage"},"image":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#primaryimage"},"thumbnailUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png","datePublished":"2023-07-27T13:07:12+00:00","dateModified":"2024-01-31T14:35:42+00:00","description":"Hugging Face is a platform that hosts machine learning models including Large Language Models (LLMs). What actors and organisations use it?","breadcrumb":{"@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.hiig.de\/en\/inside-hugging-face\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#primaryimage","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2023\/07\/Blogpost-Huggingface-\u2013-4.png","width":1920,"height":1080,"caption":"Analysing Hugging Face helps us to understand the dynamics of the machine learning hype"},{"@type":"BreadcrumbList","@id":"https:\/\/www.hiig.de\/en\/inside-hugging-face\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.hiig.de\/en\/"},{"@type":"ListItem","position":2,"name":"Inside Hugging Face"}]},{"@type":"WebSite","@id":"https:\/\/www.hiig.de\/#website","url":"https:\/\/www.hiig.de\/","name":"HIIG","description":"Alexander von Humboldt Institute for Internet and 
Society","publisher":{"@id":"https:\/\/www.hiig.de\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.hiig.de\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.hiig.de\/#organization","name":"HIIG","url":"https:\/\/www.hiig.de\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/","url":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","contentUrl":"https:\/\/www.hiig.de\/wp-content\/uploads\/2019\/06\/hiig.png","width":320,"height":80,"caption":"HIIG"},"image":{"@id":"https:\/\/www.hiig.de\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/www.hiig.de\/#\/schema\/person\/a07aa81c80d1dbd4ef1ab5c1cd9c10fd","name":"Stefanie Barth"}]}},"_links":{"self":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/95488","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/users\/9999998"}],"replies":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/comments?post=95488"}],"version-history":[{"count":6,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/95488\/revisions"}],"predecessor-version":[{"id":96431,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/posts\/95488\/revisions\/96431"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media\/95489"}],"wp:attachment":[{"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/media?parent=95488"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/categories?post=95488"},{"taxonomy":"post_tag
","embeddable":true,"href":"https:\/\/www.hiig.de\/en\/wp-json\/wp\/v2\/tags?post=95488"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}