Context: Gemma is a family of free-to-use AI models with a focus on small size. According to benchmarks, they outperform Llama 3.

        • Eager Eagle@lemmy.world · 4 months ago

          Neural network weights are just files: collections of numbers forming matrices. How is a partially open collection of weights of any use?

          The weights are open:

          $ docker exec -it ollama ollama show gemma:7b
            Model
              arch              gemma
              parameters        9B
              quantization      Q4_0
              context length    8192
              embedding length  3072

            Parameters
              stop              "<start_of_turn>"
              stop              "<end_of_turn>"
              penalize_newline  false
              repeat_penalty    1

            License
              Gemma Terms of Use
              Last modified: February 21, 2024
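          The "just collections of numbers" point, plus the Q4_0 quantization listed above, can be sketched in a few lines. This is a toy illustration, not the exact llama.cpp Q4_0 layout; the 32-value block size and 4-bit integer range are the only details borrowed from that format, and the matrix values here are random stand-ins:

```python
import numpy as np

# A checkpoint is just serialized arrays of numbers. Here, a single toy
# 32-value block stands in for one row of a real weight matrix.
rng = np.random.default_rng(0)
weights = rng.standard_normal(32).astype(np.float32)

# Rough sketch of Q4_0-style quantization (assumed scheme: one float scale
# per 32-value block, values stored as small 4-bit-range integers).
scale = np.abs(weights).max() / 7.0
q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)  # fits in 4 bits
restored = q.astype(np.float32) * scale

print("stored ints:", q[:8])
print("max abs error:", float(np.abs(weights - restored).max()))
```

          Anyone holding the file can read every number back out and dequantize them the same way; the openness question is about the license attached to those numbers, not access to them.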
          
          • jacksilver@lemmy.world · 4 months ago

            Since there is a user acceptance policy that restricts what you can do with the model, it might be considered "partially" open.

            Yeah, you can see the weights, but it seems you are limited in what you can do with them. How we've gotten to the point where you can protect these random numbers I've shared with you through a UA is beyond me.

  • BaroqueInMind@lemmy.one · 5 months ago

    I’m looking at the HuggingFace leaderboards and Gemma isn’t even in the top 50. How does it stack up against Llama 3 or Mistral-Open-Orca?