Qua

Reader

A list of all posts from all blogs on Qua whose authors have configured them to be published on this page.

from clo.gnz

The concept of the Half-full glass speaks neither of glass nor of water. Just as Negative growth does not speak of growth, and Perceived temperature does not speak of temperature.

... These are the notes of the new music, which have nothing to do with Music.

 
Read more...

from mattesilver

A set of workshop tools

I've been looking for a tool to manage development tools like linters, test runners, etc., so, for example, pipx would qualify as a candidate.

A tool manager should make the environment reproducible and portable. The installed tools should work not only on my machine but also on the machine of everyone else who works on the same project, and they should not conflict with other tools anyone has installed on their system.

Ideally the tool would have a declarative configuration, so it's clear what's used and easy to upgrade.

The problem with task managers (that aren't also tool managers) is that they don't control the tools, so a task that works on one system may fail on another. Worse, a task that works at one point in time may fail at another because of a new tool version. So a task manager should either also be a tool manager, or it should be used together with one.

On the other hand, managing tools is a big subject in itself: it's pretty much package management, needing some sort of descriptor for a lot of tools in various languages that must be installable on various systems and architectures.

In short, the requirements boil down to:

  • declarative tool SBOM
  • tool isolation (but with access to project dependencies)
  • task management

Installing tools directly in the system

The old way: the user installs tools by following the project readme. Not declarative, not reproducible, definitely not isolated.

❌ declarative ❌ isolated ❌ task management

Declarative package managers – Nix, Guix

Environments are reproducible and declarative (both use functional languages). Rather difficult to use and very intrusive – you pretty much install another operating system inside your operating system. I don't see any technical obstacle to making them single-user applications, but their authors chose otherwise. Unless you already use one of them, it's total overkill.

Maybe they work better with dev containers, but even then there are alternatives.

✅ declarative ✅ isolated ❌ task management

pipx

pip for executable packages. It supports installing multiple versions of the same tool, but there's no way to activate the required versions for a given project. And it only works for python tools (and those with a python wrapper).

❌ declarative ❌ isolated (tools are isolated from each other, but shared across projects) ❌ task management

uvx, npx, pnpx, etc

A set of tools that let you run packages from their respective language ecosystems as programs. Each works only with tools in its own language, but that's not the biggest problem, since the tools are typically written in the same language as the project, or have wrappers.

They aren't really designed to manage a toolbox; they're rather for one-off execution. There's no declarative versioning.

uvx is part of uv, which is a package manager and also a python version manager, but somehow tool management is out of scope 🤷.

❌ declarative ✅ isolated ❌ task management

Tools as dev dependencies

It's really not what dev dependencies are meant for: everything is installed in a shared environment (venv, node_modules). In the nodejs world this should be fine, as every package maintains its own dependencies, even if versions differ – that's why you end up with a huge node_modules. In python you might end up with conflicting dependencies (this somehow doesn't stop poetry projects from using the dev dependency group this way). Again, they only work for tools written in the same language as the project.

✅ declarative ❌ isolated ❓ task management (not in python; package.json has scripts)

pre-commit

Technically ticks many boxes, but not really intended as a dev tool manager.

Uses git repositories as a distributed source of packages, keeps them isolated and cached per project. Versions are fully declarative – you can even pin full SHA hashes.

This is all great, but it's literally git's pre-commit utility, which means it's designed for fast and predictable execution. It's not intended as a general-purpose tool manager – it doesn't provide the tools with the full project environment (it doesn't include dependencies), it doesn't support test runners (also because testing is too slow for a pre-commit hook), and even support for some linters (see python's mypy) is suboptimal (again, mypy works best with access to dependencies).

✅ declarative ✅ isolated (actually too isolated for my purpose) ❌ task management

Mise

Works both as a package manager (for tools) and a task manager (a replacement for package.json scripts rather than for make). It's the only tool here with the express goal of being both a tool and a task manager.

It doesn't handle tools with plugins well (e.g. pydantic.mypy) – it uses a shared directory for installs, but only the tool name and version are used as a tool's “identity”, which means that if you have two projects, both using mypy, and one of them uses pydantic with its mypy plugin, there's a conflict (there's a workaround, but it's not good for collaboration).

It also does funny things to $PATH so it doesn't work great with other tools, like IDEs or pre-commit.

✅ declarative ❌ isolated (buggy) ✅ task management

Dev containers

This isn't even a tool, it's simply a container (docker, podman, you name it) that has your tools installed and has access to your project directory.

It's great for isolation and reproducibility, as long as your installs are versioned. The isolation makes curl | sudo sh - a bit more acceptable.

I haven't worked with it, but the concept seems very reasonable, especially for the occasional build of applications that you only use (but don't develop).

Conclusion

I don't see a perfect tool here, and I'm not even looking for a polyglot multi-project / monorepo manager.

Mise comes very close, and I'm currently using it, but it has a problem with isolation and doesn't seem interested in fixing it, so I wouldn't recommend it on its own in a collaborative environment.

I'll take a closer look at mise in a dev container. Dev containers would provide the isolation and mise covers declarative tool and task management quite well.

 
Read more...

from clo.gnz

One cannot always dodge The System. From time to time, one has to resolve to confront it.

If only to confirm that one is not confusing the emancipatory force of the new with the fear of laying bare one's weaknesses in the old.

 
Read more...

from veer66

In the context of programming in the 1980s, “global variables” likely brings to mind languages like MBASIC. However, using MBASIC as an example today would be challenging, as it is now rarely used or known. Instead, GNU Bash, the default shell scripting language on many systems, will be used to illustrate what global variables were traditionally like. Some might also think of Fortran II, but I'm not familiar with it.

Bash Example

I wrote a Bash script consisting of four files:

a.bash

a.bash orchestrates everything. “Orchestration” might be too grand a word for these toy scripts, but I want to convey that it performs a role similar to Apache Airflow.

#!/bin/bash

. ./b.bash
. ./c.bash
. ./d.bash

init
print_count
inc
print_count

b.bash

b.bash contains only one function, inc, which increments counter, the global variable in this example.

inc() {
    counter=$((counter + 1))
}

c.bash

c.bash contains print_count, which simply displays the value of the global variable counter.

print_count() {
    echo $counter
}

d.bash

In Bash, counter can be initialized globally from within a function by default.

init() {
    counter=1
}

Python Example

The following section shows the result of porting the Bash script above to Python, highlighting the key differences.

a.py

This is the orchestration part. Note the use of namespaces or module names.

import c, b, d

d.init()
c.print_count()
b.inc()
c.print_count()

b.py

In the Python version, inc must refer to the module d to access the variable. Alternatively, counter could be explicitly imported, though rebinding an imported name would not affect d.counter.

import d

def inc():
    d.counter += 1

c.py

Similarly, print_count in Python must also refer to module d.

import d

def print_count():
    print(d.counter)

d.py

Unlike in Bash, initializing a global variable from within a function—even in the same module—requires an explicit global declaration.

def init():
    global counter
    counter = 1

Key Differences

As you can see, a global variable in Bash is truly global across files. In Python, however, a global variable is only global within its module, i.e., file. Furthermore, mutating a global variable in Bash requires no special syntax, whereas in Python a function must explicitly declare global to modify a module-level variable.

Consequently, although both are called “global variables,” Python's are scoped to the module. This means they won’t interfere with variables in other modules unless we deliberately make them do so. For developers who use one class per module, a Python global variable behaves much like a class variable. Additionally, variable assignment inside a Python function is local by default, preventing accidental modification of global state unless explicitly intended.
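A minimal illustration of the local-by-default rule (a hypothetical snippet, not part of the ported scripts above):

counter = 1  # a module-level, i.e. "global", variable

def try_inc():
    counter = 2  # assignment creates a new local; the module-level counter is untouched

def real_inc():
    global counter  # an explicit declaration is needed to rebind the module-level name
    counter += 1

try_inc()
print(counter)  # still 1
real_inc()
print(counter)  # now 2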

In short, many traditional precautions about global variables in languages like Bash or MBASIC no longer apply in Python. Therefore, we might reconsider automatically rejecting global variables based on past advice and instead evaluate their use case thoughtfully.

 
Read more...

from veer66

I was impressed by pkgsrc because it is a powerful package management system that defines packages using a common tool like a Makefile. I later discovered that the PKGBUILD file is even more impressive, as its purpose is immediately clear. I initially attributed this to my greater familiarity with Bash scripting compared to Makefiles, but I now believe the true reason is PKGBUILD's level of abstraction.

It retains explicit calls to configure and make, preserving the transparency of a manual installation. This demonstrates that while increased abstraction can make code shorter, it can also hinder understanding.

Another potential advantage of greater abstraction is the ability to change the configure command for every package by modifying just one location. However, since GNU Autotools has continued to use the configure command for decades, it may not be worth sacrificing clarity for this particular benefit.

 
Read more...

from veer66

(Posted at dev.to on 2025-09-02)

Expanding Emacs functionality is as simple as defining a new function, rather than creating an entire extension package as is often required in other extensible editors. This function can then be re-evaluated, tested, and modified entirely within Emacs using just a few clicks or keyboard shortcuts, with no need to restart or reload Emacs.

 
Read more...

from veer66

(Posted at dev.to on 2025-08-17)

An actively-maintained-implementation, long-term-stable-specification programming language

There are many programming languages that don't change much, including Common Lisp, but Common Lisp implementations continue to be developed. For example, SBCL (Steel Bank Common Lisp) released its latest version just last month.

Common Lisp can be extended through libraries. For example, cl-interpol adds Perl-style strings to Common Lisp without requiring a new version of the language. cl-arrows lets Common Lisp build pipelines using Clojure-style syntax without any update to the specification. This exceptional extensibility stems from Common Lisp's support for macros, and particularly reader macros.

Feature-packed

Common Lisp includes many features found in modern programming languages, such as:

  • Garbage collection
  • Built-in data structures (e.g., vectors, hash tables)
  • Type hints
  • Class definitions
  • A syntactic structure similar to list comprehensions

Multi-paradigm

While Lisp is commonly associated with functional programming, Common Lisp doesn't enforce this paradigm. It fully supports imperative programming (like Pascal), and its object-oriented programming system even includes advanced features. Best of all, you can freely mix all these styles. Common Lisp even embraces goto-like code via TAGBODY-GO.

Performance

Common Lisp has many implementations, and some of them, such as SBCL, are compilers that can generate efficient code.

With some (though of course not all) implementations, programs written in dynamic languages run slower than those written in static ones, such as C and Modula-2.

First, an example of the generated assembly will be shown, along with an explanation of why such code might be slowed down by some dynamic-language implementations.

The code listing below is part of a program written in Modula-2, which should be easy to read for programmers of languages in the extended ALGOL family.

    TYPE
      Book = RECORD
        title: ARRAY[1..64] OF CHAR;
        price: REAL;
      END;

    PROCEDURE SumPrice(a, b: Book): REAL;
    BEGIN
      RETURN a.price + b.price;
    END SumPrice;

The code is mainly for summing the price of books, and only the part 'a.price + b.price' will be focused on.

'a.price + b.price' is translated into the X86-64 assembly listing below by the GNU Modula-2 compiler.

    movsd   80(%rbp), %xmm1
    movsd   152(%rbp), %xmm0
    addsd   %xmm1, %xmm0

'movsd 80(%rbp), %xmm1' and 'movsd 152(%rbp), %xmm0' load the prices into registers '%xmm1' and '%xmm0', respectively. Finally, 'addsd %xmm1, %xmm0' adds the prices together. As can be seen, the prices are loaded from exact locations relative to the value of the '%rbp' register, which is one of the most efficient ways to load data from memory. The instruction 'addsd' is used because prices in this program are REAL (floating-point numbers), and '%xmm0', '%xmm1', and 'movsd' are used for the same reason. This generated code should be reasonably efficient. However, the compiler needs to know the type and location of the prices beforehand to choose the proper instructions and registers.

In dynamic languages, 'SumPrice' can be applied to a price whose type is an INTEGER instead of a REAL, or it can even be a string/text. A straightforward implementation would check the type of 'a' and 'b' at runtime, which makes the program much less efficient. The checking and especially branching can cost more time than adding the numbers themselves. Moreover, obtaining the value of the price attribute from 'a' and 'b' might be done by accessing a hash table instead of directly loading the value from memory. Of course, while a hash-table has many advantages, it's less efficient because it requires many steps, including comparing the attribute name and generating a hash value.

However, compilers for dynamic languages can be much more advanced than what's described above, and SBCL is one such advanced compiler. SBCL can infer types from the code, especially from literals. Moreover, with information from type hints and 'struct' usage, SBCL can generate code that is comparable in efficiency to that produced by static-language compilers.

Given the Common Lisp code listing below:

    (defstruct book
      title
      (price 0 :type double-float))

    (declaim (ftype (function (book book) double-float) add-price)
             (optimize (speed 3) (debug 0) (safety 0)))
    (defun add-price (a b)
      (+ (book-price a)
         (book-price b)))

SBCL can generate assembly code for '(+ (book-price a) (book-price b))' as shown below:

    ; 86:       F20F104A0D       MOVSD XMM1, [RDX+13]
    ; 8B:       F20F10570D       MOVSD XMM2, [RDI+13]
    ; 90:       F20F58D1         ADDSD XMM2, XMM1

The assembly format is slightly different from the one generated by the GNU Modula-2 compiler, but the main parts, the 'MOVSD' and 'ADDSD' instructions and the use of XMM registers, are exactly the same. This shows that, at least in this case, we can write Common Lisp code that is as efficient as, or nearly as efficient as, a static language's.

This implies that Common Lisp is good both for high-level rapid development and for optimized code, which has two advantages: (1) in many cases, there is no need to switch between two languages, i.e., a high-level one and a fast one; (2) code can start out high-level and be optimized in place after a profiler finds the critical parts. This paradigm can prevent premature optimization.

Interactive programming

Interactive programming may not sound familiar. However, it is a common technique that has been used for decades. For example, a database engine such as PostgreSQL doesn't need to be stopped and restarted just to run a new SQL statement. It is akin to a spreadsheet like Lotus 1-2-3 or Microsoft Excel, which can run a new formula without reloading existing sheets or restarting the program.

Common Lisp is exceptionally well-suited for interactive programming because of (1) integrated editors with a REPL (Read Eval Print Loop), (2) the language's syntax, and (3) the active community that has developed libraries specifically designed to support interactive programming.

Integrated editors with a REPL

With an editor integrated with a REPL, any part of the code can be evaluated immediately, without copying and pasting from the editor into a separate REPL. This workflow provides feedback even faster than hot reloading, because the code can be evaluated and its results seen instantaneously, even before it is saved. Many editors are supported, such as Visual Studio Code, Emacs, Neovim, and others.

The language's syntax

Instead of arbitrarily marking a region to evaluate, which is not very convenient when done every few seconds, in Common Lisp we can mark a form (similar to a block in ALGOL) by moving the cursor to one of its parentheses. This is very easy with structural editing, which will be discussed in a later section.

Moreover, even a method definition can be evaluated immediately without resetting the state of the object in Common Lisp. Since method definitions are not nested in defclass, this allows mixing interactive programming and object-oriented programming (OOP) smoothly.

Consider the code listing below:

    (defclass toto ()
      ((i :initarg :i :accessor i)))

    (defmethod update-i ((obj toto))
      (setf (i obj) (+ (i obj) 1)))

According to the code listing above, the method 'update-i' can be redefined without interfering with the pre-existing value of 'i'.

Structural editing

Instead of editing Lisp code like normal text, tree-based operations can be used, such as paredit-join-sexps and paredit-forward-slurp-sexp. There are also cursor-movement operations, such as paredit-forward, which moves the cursor to the end of the form (a block). These structural movement operations are also useful for selecting regions to be evaluated in a REPL.

Conclusion

In brief, Common Lisp has an unparalleled combination of advantages that remain relevant to software development, especially now; it is not just an archaic technology that happened to come first. For example, Forth has a long-term-stable specification and works well with interactive programming, but it is not designed for defining classes and adding type hints. Julia has similar performance optimization and even richer OOP, but it doesn't have a long-term-stable specification. Moreover, Common Lisp's community is still active, as libraries, apps, and even implementations continue to receive updates.

 
Read more...

from veer66

(Published on dev.to on 2025-07-19)

No one actually cares about my programming environment journey, but I've often been asked to share it, perhaps for the sake of social media algorithms. I'm posting it here so that later I can copy and paste it conveniently.

My first computer, in the sense that I, not someone else, made the decision to buy it, ran Debian in 2002. It was a used Compaq desktop with a Pentium II processor, which I bought from Zeer Rangsit, a used-computer market that may be the most famous in Thailand these days. When I got it home, I installed Debian right away. Before I bought my own computer, I had used MBasic, mainly MS-DOS, Windows 3.1 (though rarely), and Solaris (remotely). For experimentation, I used Xenix, AIX, and one system on a DEC PDP-11 whose name I forget.

Since I started with MBasic, that was my first programming environment. I learned Logo at a summer camp, so that became my second. Later, my father bought me a copy of Turbo Basic, and at school, I switched to Turbo Pascal.

After moving to GNU/Linux, I used editors more than IDEs. From 1995 to 2010, my editors were pico, nvi, vim, TextMate, and Emacs, paired with GCC (mostly C, not C++), PHP, Perl, Ruby, Python, JavaScript, and SQL. I also used VisualAge to learn Java in the 90s. I tried Haskell, OCaml, Objective C, Lua, Julia, and Scala too, but strictly for learning only.

After 2010, I used IntelliJ IDEA and Eclipse for Java and Kotlin. For Rust (instead of C), I used Emacs and Visual Studio Code. I explored Racket for learning purposes, then later started coding seriously in Clojure and Common Lisp. I tried Vim 9.x and Neovim too; they were great, but not quite my cup of tea.

In 2025, a few days ago, I learned Smalltalk with Pharo to deepen my understanding of OOP and exploratory programming.

Update 2025/07/20: I forgot to mention xBase. In the '90s, I used it in a programming competition, but none of my xBase programs reached production.

 
Read more...

from mattesilver

I'm developing a small package for writing typed Web API clients, similar to Uplink, only compatible with OpenAPI (lapidary). I use these clients mostly in Web servers, so I decided to make it async. So far I've settled on HTTPX, but I want to keep tabs on available python packages.

urllib.request

Its documentation's own “See also” note says: The Requests package is recommended for a higher-level HTTP client interface.

It accepts request headers only as a dict, and I've decided that this doesn't count as support for HTTP.

❌ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio
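For reference, a minimal sketch of the interface (the URL is hypothetical):

import json
import urllib.request

# Headers are passed as a plain dict; there is no connection pooling,
# no async support, and nothing newer than HTTP/1.1.
req = urllib.request.Request(
    "https://api.example.com/things",
    headers={"Accept": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    data = json.load(resp)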

urllib3

Fast and robust, slowly acquiring HTTP/2 support, but only sync.

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio

requests

This is the one every python example tells you to use. It's been under a feature freeze since forever – HTTP moves on, but requests is as perfect as possible. It also “broke the internet” when, in a minor version release, it moved to urllib3 2.0.0, which dropped support for older TLS algorithms. As long as the API doesn't change in an incompatible way, right?

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ❌ asyncio

pycURL

It's super fast, since it's a super thin wrapper over libcURL.

Also a great showcase of how not to write python wrappers. Might as well use ctypes.

✅ HTTP/1.1 ✅ HTTP/2 ✅ HTTP/3 (I think it's supported; libcURL says it's experimental) ❌ asyncio, but non-blocking, since it's native code.

HTTPX

The most popular async HTTP client package for python. Supports HTTP/2 but claims it's “not as robust” as 1.1. Both sync and async.

API is mostly compatible with requests (it's not a feature). Depending on usage pattern, it's slightly slower or much slower than aiohttp.

✅ HTTP/1.1 ✅ HTTP/2 (discouraged, opt-in) ❌ HTTP/3 ✅ sync & async
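A minimal async sketch (the URL is hypothetical, and HTTP/2 also needs the optional httpx[http2] extra):

import asyncio
import httpx

async def main():
    # http2=True is the opt-in flag; without it, HTTPX speaks HTTP/1.1.
    async with httpx.AsyncClient(http2=True) as client:
        r = await client.get("https://api.example.com/things")
        print(r.http_version, r.status_code)

asyncio.run(main())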

aiohttp

API is close to the protocol, so a bit more complex than alternatives, but I like it. One of the fastest packages in python.

The community seems more focused on the server side (more features and plugins).

✅ HTTP/1.1 ❌ HTTP/2 ❌ HTTP/3 ✅ asyncio-only
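For comparison, a minimal aiohttp sketch (URL hypothetical) shows the protocol-shaped API: a session owning the connection pool, and the response used as a context manager:

import asyncio
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        async with session.get("https://api.example.com/things") as resp:
            print(resp.status)
            body = await resp.text()  # read the body while the response is open

asyncio.run(main())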

Aiosonic

Looks and feels like HTTPX.

Depending on the usage pattern, it's either a bit slower or a bit faster than aiohttp. There's probably something wrong with it, because nobody seems to be using it. Definitely worth a closer look.

✅ HTTP/1.1 ✅ HTTP/2 – beta ❌ HTTP/3 ✅ sync & asyncio

Aioquic

A really low-level HTTP/3 & QUIC package. Only for genuine H3 users. It could use better examples – the example client has 600+ lines. Or maybe it's just that hard to use...

❌ HTTP/1.1 ❌ HTTP/2 ✅ HTTP/3 ✅ asyncio-only

Niquests

Async fork of requests with HTTP/2 and 3 support. Fast and full-featured, but not user-friendly.

It uses a forked urllib3 (by the same author) that shadows the original, so if you happen to have a transitive dependency on the original, it will mess up your environment. The author is fine with that. I recommend against using it.

✅ HTTP/1.1 ✅ HTTP/2 ✅ HTTP/3 ✅ sync & asyncio

Reqwest wrappers

This makes a lot of sense – Rust is a popular language, reqwest is a popular rust package, keep it DRY. None of the wrappers are widely popular, but some projects are live and definitely worth watching:

  • gufo-http – I think it's the oldest and still maintained
  • pyreqwesthttpr

I found a few more; just search for reqwest on PyPI.

✅ HTTP/1.1 ✅ HTTP/2 ❌ HTTP/3 – maybe, it's still experimental in reqwest ✅ sync and asyncio

Conclusion

Honestly, the state of the clients for the protocol that makes the Web makes me wonder: is Python still alive? There isn't a single popular and stable package with HTTP/2 (2015!) or HTTP/3 (2022) support, either sync or async. Is Python only for Django and LLMs? Has everyone else switched to Rust and/or Deno?

 
Read more...

from mattesilver

Based on the Litestar Fullstack Inertia showcase project.

Prerequisites

For social authorization to work, you first need to register your application with the identity provider (e.g., GitHub).

Implementation

Frontend

You need a login or registration page with a button/link that sends the browser to a backend route which starts the OAuth2 flow. For social login, it doesn’t matter whether you call it “login” or “registration”, as the first login typically creates the user.

Implementation details:

  • In the showcase project it's implemented in two partials: the Login form and the Registration form.
  • Both partials use the same button, which handles clicks with router.post(route("github.register")).
  • route() is an Inertia.js helper function which simply resolves handler names to paths.
  • A plain link and HTTP GET would also work here, since no data is sent at this stage.

Login/Register handler (backend)

Create a route that redirects to the provider's authorization URL and includes a callback pointing to your completion handler. For production, you should generate OAuth2 state and a PKCE code verifier, and store them in the session.

Implementation details:

  • See the authorization controller.
  • You can use httpx_oauth to get auth-related URLs for popular OAuth2 providers.
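As a rough sketch (not the showcase's actual code: the route path, name, and client setup are illustrative assumptions), such a handler could look like this with Litestar and httpx_oauth:

from litestar import get
from litestar.response import Redirect
from httpx_oauth.clients.github import GitHubOAuth2

# Credentials come from registering the app with the provider (see Prerequisites).
github_client = GitHubOAuth2("MY_CLIENT_ID", "MY_CLIENT_SECRET")

@get("/register/github", name="github.register")
async def github_register() -> Redirect:
    # The redirect_uri must match the callback registered with the provider.
    # In production, generate OAuth2 state (and a PKCE code verifier) here
    # and store them in the session before redirecting.
    auth_url = await github_client.get_authorization_url(
        redirect_uri="https://myapp.example/complete/github",
    )
    return Redirect(auth_url)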

Authorization callback handler (backend)

Your callback (named “github.complete” in the showcase) should:

  1. Validate the stored OAuth2 state (required for security; missing in the showcase)
  2. Exchange the authorization code for an access token
  3. Create a user object in the local database and store the user ID in the session
  4. Redirect to a post-login page (e.g., dashboard)

Optionally:

  • Fetch user details (e.g., email) from the provider
  • Provide user data to the frontend

Implementation details:

  • The showcase doesn't use OAuth2 state or PKCE.
  • GitHubOAuth2.get_access_token is used to exchange the auth code for a token (wrapped as a Litestar dependency, to facilitate re-use).
  • GitHubOAuth2.get_id_email is used to retrieve the user's email.
  • Inertia’s share() is used to pass user details to the frontend.
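A matching sketch of the callback handler (again illustrative; the local user lookup is elided, and it assumes Litestar's session middleware is configured):

from litestar import Request, get
from litestar.response import Redirect
from httpx_oauth.clients.github import GitHubOAuth2

github_client = GitHubOAuth2("MY_CLIENT_ID", "MY_CLIENT_SECRET")

@get("/complete/github", name="github.complete")
async def github_complete(request: Request, code: str) -> Redirect:
    # 1. In production, validate the `state` query parameter against the session here.
    # 2. Exchange the authorization code for an access token.
    token = await github_client.get_access_token(
        code, redirect_uri="https://myapp.example/complete/github"
    )
    # Optionally fetch user details (e.g., email) from the provider.
    provider_id, email = await github_client.get_id_email(token["access_token"])
    # 3. Create or load the local user (elided) and store its id in the session.
    request.set_session({"user_id": provider_id})
    # 4. Redirect to a post-login page.
    return Redirect("/dashboard")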


 
Read more...

from Rolistologie

Amid the morass of Macronist scheming and the traps they set for useful idiots, let's take refuge in the imaginary world of #INSMV (the #INSMVadlib version), with:

The influence of the Archangels and Demon Princes on the main groups of the French Assemblée Nationale in 2025.

On the first line, in bold, a rather active influence; on the second line, a supporting kind of involvement. I haven't specified the level of investment or the reasons, so as to keep the secrets from the players, but if GMs want to discuss it, the door is open. Obviously, some actors are involved in several political groups, but for the sake of simplification and clarity, I've placed each one only in its main group.

PCF: Haagenti, Kobal

LFI: Alain, Dajjâl, Guy, Furfur, Hassan, Jordi, Catherine

Écolos: Novalis, Eli

PS: Malphas, Jean-Luc, Blandine

Modem: Christophe, Ange

LREM: Janus, Mammon, Uphir, Valefor, Andromalius, Malthus, Shaytan, Vapula

LR: Marc, Caym

RN: Belial, Dominique, Georges, Laurent, Nybbas, Joseph

#JdR

 
Read more...

from Jéssica Maryellen

Memory game - Numbers - Level 3

Theme: Numbers and Quantities – Maternal II

This memory game was specially developed for Maternal II students, with the goal of helping in the process of recognizing the numbers 1 to 9 and associating each number with a quantity.

The game consists of six pairs of cards, with each pair formed by:

  • A card with the written number.
  • A card with the corresponding quantity in pictures.

In a playful and interactive way, children are encouraged to observe, count, and match, developing skills such as:

  • Visual recognition of numbers.
  • Sense of quantity.
  • Attention and memory.
Level 1 -:-:-:– Level 2 -:-:-:– Level 3

Read more...

from Jéssica Maryellen

Memory game - Numbers - Level 2

Theme: Numbers and Quantities – Maternal II

This memory game was specially developed for Maternal II students, with the goal of helping in the process of recognizing the numbers 1 to 9 and associating each number with a quantity.

The game consists of five pairs of cards, with each pair formed by:

  • A card with the written number.
  • A card with the corresponding quantity in pictures.

In a playful and interactive way, children are encouraged to observe, count, and match, developing skills such as:

  • Visual recognition of numbers.
  • Sense of quantity.
  • Attention and memory.
Level 1 -:-:-:– Level 2 -:-:-:– Level 3

Read more...

from Jéssica Maryellen

Memory game - Numbers - Level 1

Theme: Numbers and Quantities – Maternal II

This memory game was specially developed for Maternal II students, with the goal of helping in the process of recognizing the numbers 1 to 9 and associating each number with a quantity.

The game consists of four pairs of cards, with each pair formed by:

  • A card with the written number.
  • A card with the corresponding quantity in pictures.

In a playful and interactive way, children are encouraged to observe, count, and match, developing skills such as:

  • Visual recognition of numbers.
  • Sense of quantity.
  • Attention and memory.
Level 1 -:-:-:– Level 2 -:-:-:– Level 3

Read more...

from Rolistologie

If you understood this title, see you in an upcoming post to talk about it. Today, I will simply lay down some definitions, to be sure we communicate clearly using the same terms. Indeed, I have already spent hours debating with other roleplayers when we were in rough agreement, but muddled by the lack of a shared vocabulary.

So here are my proposed definitions, mainly based on the common literature, and clarified where some precision was needed:

Game system: the set of non-diegetic mechanisms used to model the game's universe.

Action Resolution System (SRA): the mechanism for resolving actions (or conflicts). SRAs can be sorted into 3 categories: Alea, Karma, and Drama. The SRA is, of course, included in the game system.

Alea: an SRA that relies on chance: dice, cards, draws…

Karma: an SRA that uses fixed values, without chance: comparing values, spending points…

Drama: an SRA based on the story: choices according to consistency with the game's canon, according to the interest of the scenario…

Player system: the set of rules determining the players' roles and the distribution of authority. An example of a player system, in the “traditional RPG”: one player is the GM with all the authority, the others play the PCs.

Immersion: Ah, this one deserves debate, because several quite different notions hide under this single term. It covers both concentration on the game session and the act of playing a role, what Scandinavian theorists call eläytyminen, which is used in the multiple-immersions model. For some, it would even be the sensation of living the fiction, as the boundaries between reality and imagination blur. I go into detail in a post dedicated to immersion.

Authority: In the context of roleplaying games, authority is the right to decide, to establish, what is true in the fiction. Authority can be held by a single person who may delegate it at will (like the GM in the classic player system), can rotate, or can be shared among several players.

#JdR #Théorie

 
Read more...

from تنها

The most distant memory of which I can barely recall a blurry scene goes back to when I was a small child. Since I remember only a very brief moment of that #memory, I cannot describe it in much detail. So I will briefly recount whatever my mind can recall.

I remember sitting on a swing on the porch of our house. The porch and the house were in a style like European houses, and the house was probably built from light materials such as wood. The road and the ground around the house were dirt, and as far as I remember, the space in front of the house was open, with no other structure in sight. For a moment I turned and watched my mother, who was busy with housework. The whole time, a song sung by a female singer could be heard from the house; I cannot tell which song it was or who sang it, but every time I try to recall it, a voice resembling Googoosh or Marjan echoes in my mind.

When I described this memory to my mother, she knew the place perfectly. The house was in the Bandargah neighborhood of Bushehr, in the township for the foreign staff of the Bushehr nuclear power plant; because of my father's job, we had been granted the privilege of living there. My mother said I was about 2 years old at the time. Sadly, my father and mother are no longer alive, and I cannot learn any more about those days from them.

 
Read more...