feat: new articles

parent c5db049494
commit a66fc37c03
```diff
@@ -1,10 +1,8 @@
 {
   "rules": {
-    "max-ten": true,
     "no-start-duplicated-conjunction": {
       "interval": 2
     },
-    "no-dropping-the-ra": true,
     "common-misspellings": true,
     "preset-japanese": {
       "sentence-length": false
```
package.json (14 lines changed)

```diff
@@ -5,7 +5,8 @@
     "build": "hexo generate",
     "clean": "hexo clean",
     "deploy": "hexo deploy",
-    "start": "yarn clean && hexo server --debug"
+    "start": "yarn clean && hexo server --debug",
+    "test": "lint-staged"
   },
   "dependencies": {
     "hexo": "^5.0.0",
@@ -23,6 +24,17 @@
     "hexo-server": "^2.0.0",
     "hexo-theme-landscape": "^0.0.3"
   },
+  "devDependencies": {
+    "lint-staged": "^10.5.4",
+    "textlint": "^11.8.2",
+    "textlint-filter-rule-whitelist": "^2.0.0",
+    "textlint-rule-common-misspellings": "^1.0.1",
+    "textlint-rule-no-start-duplicated-conjunction": "^2.0.2",
+    "textlint-rule-preset-japanese": "^5.0.0"
+  },
+  "lint-staged": {
+    "*.md": "textlint"
+  },
   "hexo": {
     "version": "5.3.0"
   },
```
Deleted post (@@ -1,45 +0,0 @@):

---
title: Extract Thumbnail Image from Affinity Photo and Affinity Design
---

Nextcloud doesn't have support for thumbnail generation from Affinity Photo and Affinity Design files, so I had to do it myself.

# Digging Binary

Glancing at `.afphoto` and `.afdesign` files in Finder, I noticed that they have QuickLook support and can show a thumbnail image. So these files should have a thumbnail image somewhere inside their binary.

I wrote a simple script that seeks the thumbnail image in a binary and saves it as a `.png` file.

```js af.js
const fs = require("fs");

// png spec: https://www.w3.org/TR/PNG/
const PNG_SIG = Buffer.from([137, 80, 78, 71, 13, 10, 26, 10]);
const IEND_SIG = Buffer.from([73, 69, 78, 68]);

function extractThumbnail(buf) {
  const start = buf.indexOf(PNG_SIG);
  const end = buf.indexOf(IEND_SIG, start) + IEND_SIG.length * 2; // IEND + CRC
  return buf.subarray(start, end);
}

function generateThumbnail(input, output) {
  const buf = fs.readFileSync(input);
  const thumbBuf = extractThumbnail(buf);
  fs.writeFileSync(output, thumbBuf);
}

generateThumbnail(process.argv[2], process.argv[3] || "output.png");
```

That's right. This script just scans a binary file and extracts the portion that starts with the `PNG` signature and ends with `IEND`.

Now I can generate a thumbnail image from an arbitrary `.afphoto` or `.afdesign` file. Let's move on to delving into the Nextcloud source code.

# Tweaking Nextcloud

I have a little prior experience tweaking the Nextcloud source code, where I implemented a thumbnail generator for PDFs, so it should be easier this time, hopefully.



Anyway, long story short, I got Nextcloud to generate thumbnail images for Affinity files by implementing the PreviewGenerator class.
Deleted post (@@ -1,36 +0,0 @@):

---
title: Build Chromium from Scratch
---

Prepare the prerequisites:

```
brew install ccache
git config --global core.precomposeUnicode true
```

Get the source code:

```shell
ghq get https://chromium.googlesource.com/chromium/tools/depot_tools.git
cd `ghq root`/chromium.googlesource.com/chromium
fetch chromium
```

Add the following to `.envrc` and apply the environment variables with `direnv allow`:

```shell
PATH_add `ghq root`/chromium.googlesource.com/chromium/tools/depot_tools
PATH_add src/third_party/llvm-build/Release+Asserts/bin
export CCACHE_CPP2=yes
export CCACHE_SLOPPINESS=time_macros
export SPACESHIP_GIT_SHOW=false
```

Build:

```shell
cd src
gn gen out/Default --args='cc_wrapper="ccache"'
autoninja -C out/Default chrome
```
Deleted post (@@ -1,23 +0,0 @@):

---
title: Tips for Traveling in Shenzhen
---

## WeChat Pay

Almost all meals in Shenzhen could be paid for with WeChat Pay.

## UnionPay

Visa cards were not accepted at hotels and Starbucks, so I ended up paying with a UnionPay card.

## Pocketchange

Pocketchange kiosks installed at Japanese airports can exchange foreign currency and Japanese yen into various kinds of e-money.

## Google Translate

Download the translation data in advance so it can be used offline.

## Octopus / 深圳通

I picked up the Octopus card I had reserved on KKday at Hong Kong airport. The 深圳通 (Shenzhen Tong) card I bought locally.
Deleted post (@@ -1,12 +0,0 @@):

---
title: Stop Calling It "Deconvolution"
date: 2017-03-05 13:44:00 +09:00
---

In deep learning, a convolutional layer reduces a tensor of a given shape to an equal or smaller size. A deconvolution layer, on the other hand, proposed in the paper by [Jonathan Long, et al](https://arxiv.org/abs/1411.4038), enlarges a tensor of a given shape to an equal or larger size.

In reality, though, this layer should be called a transposed convolution layer. Here is why:

> Upsampling is backwards strided convolution.

In light of the discussion on [Stack Exchange](http://datascience.stackexchange.com/questions/6107/what-are-deconvolutional-layers)…
Deleted post (@@ -1,30 +0,0 @@):

---
title: Developing Web Apps in One Minute
---

## 0. Setup Homebrew and Node

```
/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
```

```
brew install node
```

## 1. Scaffold from template

```
npx express-generator --view=pug awesome-app
cd awesome-app
npm install
npm start
```

## 2. Deploy with Now

```
npm install -g now
now login
now --public
```
```diff
@@ -1,11 +1,11 @@
 ---
-title: 'gst: a powerful pal for ghq'
+title: "gst: a powerful pal for ghq"
 date: 2017-06-02 23:02:00 +09:00
 ---
 
 [gst](https://github.com/uetchy/gst) is a tiny and simple but powerful pal for [ghq](https://github.com/motemen/ghq).
 
-Have you ever wished you knew which commits are unpushed or which changes are still uncommitted, for all of the repositories you have cloned on your machine?
+Have you ever wished you knew which commits are unpushed or which changes are still uncommitted, for all the repositories you have cloned on your machine?
 
 You might want to check out my ongoing project `gst`:
 it might help you know which changes remain to be committed or pushed across all your local repositories.
```
```diff
@@ -3,7 +3,7 @@ title: Packet Capturing
 ---
 
 - On macOS Mojave, packets cannot be received in monitor mode unless Wi-Fi is turned off.
-- Save the decryption keys under Preferences > Protocols > IEEE 802.11. It is safer to store the wpa-psk-hashed value. They are saved to `.config/wireshark`.
+- Save the decryption keys under Preferences > Protocols > IEEE 802.11. However, it is safer to store the wpa-psk-hashed value. They are saved to `.config/wireshark`.
 - To peek into encrypted 802.11 traffic, you need to observe the 4-way handshake (EAPOL). Toggling Wi-Fi on the target device is enough to trigger it.
 
 ## Commands
```
Deleted post (@@ -1,21 +0,0 @@):

---
title: "[].map(parseInt)"
---

## Fun fact

`[0xa, 0xa, 0xa].map(parseInt)` results in `[10, NaN, 2]`.

## Why???

`parseInt(0xa, 0, [0xa, 0xa, 0xa])`

The second argument is `0`, so the first argument is treated as a decimal number, becoming `10`.

`parseInt(0xa, 1, [0xa, 0xa, 0xa])`

The second argument is `1`, which is invalid as a radix, so the result ends up being `NaN`.

`parseInt(0xa, 2, [0xa, 0xa, 0xa])`

The second argument is `2`, meaning the first argument is handled as a binary number. `0xa` stringifies to `"10"`, which read as binary is `2` in decimal.
Deleted post (@@ -1,14 +0,0 @@):

---
date: 2020-02-13 16:22:05 +0900
title: How to Obtain Silence
---

For people with hyperacusis, or anyone whose performance suffers from ambient noise, here are some ways to obtain silence.

## EARIN M-2

[EARIN](https://earin.com/) makes true wireless Bluetooth earphones. I replace the stock ear tips with Comply ear tips that fit my ears.

## Moldex

Moldex is a maker of disposable earplugs, offering a diverse lineup across various noise-reduction levels.
@ -1,21 +1,21 @@
|
|||||||
---
|
---
|
||||||
title: Toxicity Analysis in Vtuber Live Chat
|
title: Toxicity Analysis in YouTube Live Chat
|
||||||
---
|
---
|
||||||
|
|
||||||
A little exploration and experiment on classifying toxic comments.
|
A little exploration and experiment on toxic activities.
|
||||||
|
|
||||||
# Why
|
# Why
|
||||||
|
|
||||||
The motivation is simple; I just feel sad when they look suffered from toxic comments in live chats. The goal is also simple: design an automated system to spot toxic comments and destroy them.
|
The motivation is quite simple; I just feel sad when they sound suffered from toxic chats. The goal is also simple: design an automated system to spot toxic chat and quarantine them.
|
||||||
|
|
||||||
# Data Data Data
|
# Data, Data, Data
|
||||||
|
|
||||||
> I can't make bricks without clay.
|
> I can't make bricks without clay.
|
||||||
> — Sherlock Holmes
|
> — Sherlock Holmes
|
||||||
|
|
||||||
I need a myriad of live chat comments and moderation events for analysis and future use.
|
I need a myriad of live chat comments and moderation events for analysis and future use.
|
||||||
|
|
||||||
Unfortunately, YouTube API does not offer a way to retrieve these kind of events in real time. Which is so crucial because live streams are only place we can observe moderators' activities through API response. Once it gets archived, these events are no longer available.
|
Unfortunately, YouTube API does not offer a way to retrieve these kinds of events in real time. Which is so crucial because live streams are only place we can observe moderators' activities through API response. Once it gets archived, these events are no longer available.
|
||||||
|
|
||||||
## Collecting Crusts
|
## Collecting Crusts
|
||||||
|
|
||||||
@ -29,17 +29,17 @@ collector <videoId>
|
|||||||
|
|
||||||
A line with white text is a normal chat, with red text is a ban event, with yellow text is a deletion event.
|
A line with white text is a normal chat, with red text is a ban event, with yellow text is a deletion event.
|
||||||
|
|
||||||
## Make the Bread Rise
|
## Make a Bread Rise
|
||||||
|
|
||||||
I know, that's not scalable at all. A new live stream comes in, I copy and paste video id into the terminal and run the script. How sophisticated.
|
I know, that's not scalable at all. A new live stream comes in, I copy and paste video id into the terminal and run the script. How sophisticated.
|
||||||
|
|
||||||
Thankfully, there's a fantastic service around Hololive community: [Holotools](https://hololive.jetri.co). They operates an API that gives us past, ongoing, and upcoming live streams from Hololive talents.
|
Thankfully, there's a great web service around Hololive community: [Holotools](https://hololive.jetri.co). They operate an API that gives us an index of past, ongoing, and upcoming live streams from Hololive talents.
|
||||||
|
|
||||||
Here I divided my system into two components: watch tower and collection worker. Watch tower periodically checks for newly scheduled live streams through Holotools API and create a job to be handled by workers. Collection workers are responsible for handling jobs and spawning a process to collect live chat events.
|
Here I divided my system into two components: Scheduler and workers. Scheduler periodically checks for newly scheduled live streams through Holotools API and create a job to be handled by workers. Workers are responsible for handling jobs and spawning a process to collect live chat events.
|
||||||
|
|
||||||

|

|
||||||
|
|
||||||
I run the cluster for a while and by far it collects approximately 1 million comments per day. Now I could reliably run my own bakery.
|
I run the cluster for a while and by far it hoards approximately 1 million comments per day. Now I could reliably run my own bakery.
|
||||||
|
|
||||||
# Look Before You Leap
|
# Look Before You Leap
|
||||||
|
|
||||||
@ -51,23 +51,23 @@ Okay take a close look at the data before actually starting to build a model.
|
|||||||
|
|
||||||
## By language
|
## By language
|
||||||
|
|
||||||
# Making Dataset
|
# Creating Dataset
|
||||||
|
|
||||||
## Labelling Spam & Toxic Chat
|
## Labelling Spam & Toxic Chat
|
||||||
|
|
||||||
### Utilizing Moderators' Activities
|
### Utilizing Moderators' Activities
|
||||||
|
|
||||||
### Introducing Balanced Collocation Entropy
|
### Introducing Normalized Co-occurrence Entropy
|
||||||
|
|
||||||
$$
|
$$
|
||||||
BCE(T) = \frac{N_T}{RLE_{string}(BWT(T))}
|
NCE(T) = \frac{N_T}{RLE_{string}(BWT(T))}
|
||||||
$$
|
$$
|
||||||
|
|
||||||
$$
|
$$
|
||||||
BWT[T,i] = \begin{cases} T[SA[i]-1], & \text{if }SA[i] > 0\\ \$, & \text{otherwise}\end{cases}
|
BWT[T,i] = \begin{cases} T[SA[i]-1], & \text{if }SA[i] > 0\\ \$, & \text{otherwise}\end{cases}
|
||||||
$$
|
$$
|
||||||
|
|
||||||
Shannon Entropy is not enough. So I decided to combine the ideas of [Burrows-Wheeler Transform](https://en.wikipedia.org/wiki/Burrows%E2%80%93Wheeler_transform) and [Run-length Encoding](https://en.wikipedia.org/wiki/Run-length_encoding) and create a new entropy which represents "spamness" better than Shannon entropy does.
|
Shannon Entropy is not enough. So I combined the ideas of [Burrows-Wheeler Transform](https://en.wikipedia.org/wiki/Burrows%E2%80%93Wheeler_transform) and [Run-length Encoding](https://en.wikipedia.org/wiki/Run-length_encoding) to formulate a new entropy which represents "spamminess" of given text.
|
||||||
|
|
||||||
### Browser Extension
|
### Browser Extension
|
||||||
|
|
||||||
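The diff does not spell out the exact definitions of $N_T$ and $RLE_{string}$; assuming $N_T$ is the length of $T$ and $RLE_{string}$ returns the length of the run-length-encoded string, a naive sketch of the metric (rotation sorting in place of the suffix-array BWT formulation above, which it is effectively equivalent to) might look like:

```javascript
// Naive BWT via sorted rotations of T + "$" (a sentinel, as in the
// suffix-array formulation above; rotation sorting is just easier to read).
function bwt(text) {
  const t = text + "$";
  const rotations = [];
  for (let i = 0; i < t.length; i++) {
    rotations.push(t.slice(i) + t.slice(0, i));
  }
  rotations.sort();
  return rotations.map((r) => r[r.length - 1]).join("");
}

// Length of a run-length-encoded string, counting each run as
// "char + count" (a simplification; real RLE needs multi-digit counts).
function rleLength(text) {
  let runs = 0;
  for (let i = 0; i < text.length; i++) {
    if (i === 0 || text[i] !== text[i - 1]) runs++;
  }
  return runs * 2;
}

// Repetitive (spammy) text compresses well after BWT, so its NCE is high.
function nce(text) {
  return text.length / rleLength(bwt(text));
}

console.log(nce("wwwwwwwwww"), nce("an ordinary chat message"));
```

A copy-pasted run of one character scores far higher than ordinary text, matching the intuition that high NCE flags spam.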
```diff
@@ -85,17 +85,4 @@ Here's a [t-SNE](https://en.wikipedia.org/wiki/T-distributed_stochastic_neighbor
 
 # Future
 
+When it's ready, I'm going to publish the dataset and pre-trained model used in this experiment.
-# Omake
-
-## Hololive Dataset
-
-I made the collected chat events publicly available for those interested in further research.
-
-The dataset contains:
-
-- Chats
-- Superchats (amount, currency)
-- Retraction events
-- Moderation events (ban, delete)
-
-## Toxicity Estimator Pre-trained Model
```
Deleted post (@@ -1,40 +0,0 @@):

---
title: A New Home Server
---

I got a new server for the first time in ten years. The last time I built my own server I was a high school student living in Okinawa. I installed Debian on a BTO tower PC and used it as a web server. With no UPS, my web services went down every time a typhoon knocked out the power.

Buying a finished machine would be the end of it, but assembling a server from parts sounds far more fun, so let's do that. My first AMD, my first DDR4 memory, my first NVM Express.

# Specs

Let's consider the use cases:

- Machine learning server
- Home cloud
- Mail server
- File server (Nextcloud)
- VPN server, etc.
- Host for VS Code Remote SSH
- Heavy makes and whatnot
- TabNine
- Web server
- Deployment target for web apps and Telegram bots

Since I want to run heavy tasks in parallel, CPU and memory come first. For memory I prioritized dual-channel operation and chose 32GB x2; for the CPU I chose the Ryzen 9 3950X, considering how well today's libraries use multiple cores.

> In hindsight, I needed more memory. Parallel-processing a huge Pandas DataFrame eats it up in no time. If your budget allows, prepare around 128GB.

For the GPU I reused the NVIDIA GeForce GTX TITAN X (Maxwell) from my old server. It has 12GB of memory, and even under the heaviest workload 5GB remains free, so it is enough for now.

Storage is two 3TB HDDs and a 500GB NVMe drive. The NVMe drive is for the OS; the HDDs are for data and backups.

The motherboard is an ASRock B550 Taichi. Compared with X570 boards, I chose the B550 for its capacitors and stability.

I chose an 800W power supply with future GPU additions in mind. Measuring actual power draw, the server idles around 180W and stays under 350W even at full load. If I buy a UPS later, a grade around that figure plus some buffer should do.

The case is a Fractal Design Meshify 2. Simple and good.

For the OS, I graduated from the Debian family I had long been friends with and tried Arch Linux. I'm already becoming a fan. It truly prepares nothing for you. There is no setup wizard. You get dropped straight into single-user mode. Even `which` isn't available until you install it. If you consider yourself a neat freak, use Arch. Also, I published a few packages on the AUR, so please vote for them if you like them.

I wrote a separate article on setting up Arch Linux, so please read it. I recorded every command I typed, so it is reproducible.
source/_posts/2021/affinity-thumbnail.md (new file, +153)

---
title: Distill Thumbnail from .afphoto and .afdesign
date: 2021-02-14T13:30:00
---

Nextcloud does not have support for generating thumbnails from Affinity Photo and Affinity Design files. Fine, I'll do it myself.

# Digging Binary

Glancing at `.afphoto` and `.afdesign` files in Finder, I noticed that they have QuickLook support and can show a thumbnail image. So these files should have a thumbnail image somewhere inside their binary.

I wrote a simple script that seeks the thumbnail image inside a binary and saves it as a PNG file.

```js
const fs = require("fs");

// png spec: https://www.w3.org/TR/PNG/
const PNG_SIG = Buffer.from([137, 80, 78, 71, 13, 10, 26, 10]);
const IEND_SIG = Buffer.from([73, 69, 78, 68]);

function extractThumbnail(buf) {
  const start = buf.indexOf(PNG_SIG);
  const end = buf.indexOf(IEND_SIG, start) + IEND_SIG.length * 2; // IEND + CRC
  return buf.subarray(start, end);
}

function generateThumbnail(input, output) {
  const buf = fs.readFileSync(input);
  const thumbBuf = extractThumbnail(buf);
  fs.writeFileSync(output, thumbBuf);
}

generateThumbnail(process.argv[2], process.argv[3] || "output.png");
```

That's right. This script just scans a binary file and extracts the portion that starts with the `PNG` signature and ends with `IEND`.

Now I can generate a thumbnail image from an arbitrary `.afphoto` or `.afdesign` file. Let's move on to delving into the Nextcloud source code.

# Tweaking Nextcloud

I have a little prior experience tweaking the Nextcloud source code, where I implemented a thumbnail generator for PDFs, so it should be easier this time, hopefully.

Long story short, I got Nextcloud to generate thumbnail images for Affinity files by implementing a `ProviderV2` class.

```php lib/private/Preview/Affinity.php
<?php

namespace OC\Preview;

use OCP\Files\File;
use OCP\IImage;

class Affinity extends ProviderV2 {
	public function getMimeType(): string {
		return '/application\/x-affinity-(?:photo|design)/';
	}

	public function getThumbnail(File $file, int $maxX, int $maxY): ?IImage {
		$tmpPath = $this->getLocalFile($file);

		$handle = fopen($tmpPath, 'rb');
		$fsize = filesize($tmpPath);
		$contents = fread($handle, $fsize);
		$start = strrpos($contents, "\x89PNG");
		$end = strrpos($contents, "IEND", $start);
		$subarr = substr($contents, $start, $end - $start + 8);

		fclose($handle);
		$this->cleanTmpFiles();

		$image = new \OC_Image();
		$image->loadFromData($subarr);
		$image->scaleDownToFit($maxX, $maxY);

		return $image->valid() ? $image : null;
	}
}
```

```patch lib/private/PreviewManager.php
@@ -363,6 +365,8 @@
 		$this->registerCoreProvider(Preview\Krita::class, '/application\/x-krita/');
 		$this->registerCoreProvider(Preview\MP3::class, '/audio\/mpeg/');
 		$this->registerCoreProvider(Preview\OpenDocument::class, '/application\/vnd.oasis.opendocument.*/');
+		$this->registerCoreProvider(Preview\Affinity::class, '/application\/x-affinity-(?:photo|design)/');
 
 		// SVG, Office and Bitmap require imagick
 		if (extension_loaded('imagick')) {
```

```patch lib/composer/composer/autoload_static.php
@@ -1226,6 +1226,7 @@
 		'OC\\OCS\\Result' => __DIR__ . '/../../..' . '/lib/private/OCS/Result.php',
 		'OC\\PreviewManager' => __DIR__ . '/../../..' . '/lib/private/PreviewManager.php',
 		'OC\\PreviewNotAvailableException' => __DIR__ . '/../../..' . '/lib/private/PreviewNotAvailableException.php',
+		'OC\\Preview\\Affinity' => __DIR__ . '/../../..' . '/lib/private/Preview/Affinity.php',
 		'OC\\Preview\\BMP' => __DIR__ . '/../../..' . '/lib/private/Preview/BMP.php',
 		'OC\\Preview\\BackgroundCleanupJob' => __DIR__ . '/../../..' . '/lib/private/Preview/BackgroundCleanupJob.php',
 		'OC\\Preview\\Bitmap' => __DIR__ . '/../../..' . '/lib/private/Preview/Bitmap.php',
```

```patch lib/composer/composer/autoload_classmap.php
@@ -1197,6 +1197,7 @@
 	'OC\\OCS\\Result' => $baseDir . '/lib/private/OCS/Result.php',
 	'OC\\PreviewManager' => $baseDir . '/lib/private/PreviewManager.php',
 	'OC\\PreviewNotAvailableException' => $baseDir . '/lib/private/PreviewNotAvailableException.php',
+	'OC\\Preview\\Affinity' => $baseDir . '/lib/private/Preview/Affinity.php',
 	'OC\\Preview\\BMP' => $baseDir . '/lib/private/Preview/BMP.php',
 	'OC\\Preview\\BackgroundCleanupJob' => $baseDir . '/lib/private/Preview/BackgroundCleanupJob.php',
 	'OC\\Preview\\Bitmap' => $baseDir . '/lib/private/Preview/Bitmap.php',
```



It works!

# Bonus: PDF thumbnail generator

```php lib/private/Preview/PDF.php
<?php

namespace OC\Preview;

use OCP\Files\File;
use OCP\IImage;

class PDF extends ProviderV2 {
	public function getMimeType(): string {
		return '/application\/pdf/';
	}

	public function getThumbnail(File $file, int $maxX, int $maxY): ?IImage {
		$tmpPath = $this->getLocalFile($file);
		$outputPath = \OC::$server->getTempManager()->getTemporaryFile();

		$gsBin = \OC_Helper::findBinaryPath('gs');
		$cmd = $gsBin . " -o " . escapeshellarg($outputPath) . " -sDEVICE=jpeg -sPAPERSIZE=a4 -dLastPage=1 -dPDFFitPage -dJPEGQ=90 -r144 " . escapeshellarg($tmpPath);
		shell_exec($cmd);

		$this->cleanTmpFiles();

		$image = new \OC_Image();
		$image->loadFromFile($outputPath);
		$image->scaleDownToFit($maxX, $maxY);

		unlink($outputPath);

		return $image->valid() ? $image : null;
	}
}
```
(binary image changed; size unchanged at 437 KiB)
```diff
@@ -1,6 +1,6 @@
 ---
 title: The Expressive Power of Braille
-date: 2021-02-13
+date: 2021-02-13T01:00:00
 ---
 
 "What would I do if I had to design a braille system that can express n kinds of characters?"
@@ -11,7 +11,7 @@ date: 2021-02-13
 - $f(N,K) = \sum_{i=K \to 0} {}_{N}P_{K}$
 - Looks like powers of 2
 - The cardinality of a power set?
-- It became easier to understand when I thought of it as a binary boolean array
+- It becomes easier to understand if you think of it as a binary boolean array
 - Which means the length of the boolean array needed for X distinct expressions is $\lceil\log_2 (X)\rceil$
 - For example, the alphabet can be expressed with 6-dot braille
 - That's why English braille has six dots
```
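As a quick check of the bound in the bullet list above (not part of the post itself):

```javascript
// Minimum number of dots (boolean array length) needed to express
// x distinct characters, per the ceil(log2(x)) bound above.
const dotsFor = (x) => Math.ceil(Math.log2(x));

console.log(dotsFor(26)); // 5 dots suffice for the bare alphabet,
// so the six dots of English braille leave headroom for digits and punctuation.
```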
````diff
@@ -1,9 +1,9 @@
 ---
-title: Arch Linux Setup Guide
+title: Installing Arch Linux
 date: 2021-02-12
 ---
 
-This note includes all commands I typed when I setup Arch Linux on my new baremetal server.
+This note includes all commands I typed when I set up Arch Linux on my new bare metal server.
 
 # Why I choose Arch Linux
 
@@ -26,7 +26,7 @@ This note includes all commands I typed when I set up Arch Linux on my new barem
 wipefs -a /dev/sda
 ```
 
-## create parition
+## create partition
 
 ```bash
 parted
@@ -54,7 +54,7 @@ mount /dev/sda2 /mnt
 mount /dev/sda1 /mnt/boot
 ```
 
-## install base & linux kernel
+## install base & Linux kernel
 
 ```bash
 reflector -f 10 --latest 30 --protocol https --sort rate --save /etc/pacman.d/mirrorlist # optimize mirror list
@@ -83,7 +83,7 @@ grub-install --target=x86_64-efi --efi-directory=/boot --bootloader-id=GRUB
 grub-mkconfig -o /boot/grub/grub.cfg
 ```
 
-## ntp
+## NTP
 
 ```bash
 sed -i -e 's/#NTP=/NTP=0.arch.pool.ntp.org 1.arch.pool.ntp.org 2.arch.pool.ntp.org 3.arch.pool.ntp.org/' -e 's/#Fall/Fall/' /etc/systemd/timesyncd.conf
@@ -252,7 +252,7 @@ reboot
 
 # Additional setup
 
-## gpgpu
+## GPGPU
 
 ```bash
 pacman -S nvidia
@@ -297,7 +297,7 @@ usermod -aG docker user
 docker run --rm -it --gpus all nvidia/cuda:10.2-cudnn7-runtime
 ```
 
-## telegraf
+## Telegraf
 
 ```bash
 yay -S telegraf
@@ -486,7 +486,7 @@ ln -sf /etc/backups/borg.* /etc/systemd/system/
 systemctl enable --now borg
 ```
 
-## kubernetes
+## Kubernetes
 
 ```bash
 pacman -S kubeadm kubelet kubectl
````
```diff
@@ -99,4 +99,4 @@ JWT is a specification for generating assertions using JSON.
 
 July 2019
 
-It stipulates using JWT for the Access Token passed to the resource server.
+It stipulates using JWT for the Access Token handed over to the resource server.
```
source/_posts/2021/parseint-magic.md (new file, +26)

---
title: "[].map(parseInt)"
date: 2021-02-14T11:30:00
---

Fun fact: `[0xa, 0xa, 0xa].map(parseInt)` yields `[10, NaN, 2]`.

# Why

```js
parseInt(0xa, 0, [0xa, 0xa, 0xa]);
```

The second argument is `0`, so the first argument is treated as a decimal number, becoming `10`.

```js
parseInt(0xa, 1, [0xa, 0xa, 0xa]);
```

The second argument is `1`, which is invalid as a radix, so the result ends up being `NaN`.

```js
parseInt(0xa, 2, [0xa, 0xa, 0xa]);
```

The second argument is `2`, meaning the first argument is handled as a binary number. `0xa` stringifies to `"10"`, which read as binary is `2` in decimal.
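To spell out the takeaway (a sketch, not from the post itself): `map` hands `(element, index, array)` to its callback, so `parseInt` silently receives each index as its radix. Pin the radix yourself, or use `Number`:

```javascript
// map passes (element, index, array), so parseInt gets the index as radix:
const broken = [0xa, 0xa, 0xa].map(parseInt); // [10, NaN, 2]

// Pass the radix explicitly to avoid the trap:
const fixed = [0xa, 0xa, 0xa].map((n) => parseInt(n, 10)); // [10, 10, 10]

// Or use Number, which takes exactly one argument:
const simpler = [0xa, 0xa, 0xa].map(Number); // [10, 10, 10]

console.log(broken, fixed, simpler);
```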
source/_posts/2021/server-2020.md (new file, +38)

---
title: The New Home Server Build
date: 2021-02-13T00:00:00
---

I updated my server for the first time in 10 years. My first AMD, my first DDR4, my first NVM Express.

# Use Cases and Specs

- Self-hosting (Docker)
- Mail server
- DNS server
- Nextcloud
- TimeMachine
- GitLab
- LanguageTool
- VPN, etc.
- Computational experiments
- Web server
- Host for VS Code Remote SSH

Since I want to run heavy tasks in parallel, CPU and memory come first. For memory I prioritized dual-channel operation and chose 32GB x2; for the CPU I chose the Ryzen 9 3950X, considering how well today's libraries use multiple cores.

> In hindsight, I needed more memory. Parallel-processing a huge Pandas DataFrame eats it up in no time. If your budget allows, prepare around 128GB.

For the GPU I reused the NVIDIA GeForce GTX TITAN X (Maxwell) from my old server. It has 12GB of graphics memory, and even under the heaviest workload 5GB remains free, so it is enough for now.

Storage is two 3TB HDDs, a 500GB NVMe drive, and a 500GB SSD pulled out of the old server. The NVMe drive is for the OS; the SSD and HDDs are for data and backups.

For the motherboard, I chose the ASRock B550 Taichi over X570 boards because its capacitors and components seemed better suited for a server.

I chose an 800W power supply with future GPU additions in mind. Measuring power consumption with the server running, it idles around 180W and stays under 350W even at full load. If I buy a UPS later, a grade around that figure plus some buffer should do.

The case is a Fractal Design Meshify 2.

For the OS, I parted ways with Ubuntu, my longtime companion, and chose Arch Linux. I love how minimal it is. It truly prepares nothing for you. There is no setup wizard. Even `which` isn't available until you install it.

I wrote a [separate article](https://uechi.io/blog/installing-arch-linux/) on setting up Arch Linux, so please read it. I recorded every command I typed.
````diff
@@ -1,10 +1,10 @@
 ---
 title: A Bill-Splitting Algorithm that Settles Up with the Minimum Number of Transfers
-date: 2021-02-14
+date: 2021-02-14T00:00:00
 ---
 
 What awaits you after enjoying camp with a big group is the unbearable money-transfer ritual.
-To make it easier next time, let's think about how to build a settlement table under the constraint of minimizing the number of transfers.
+To make it easier next time, let's build a settlement table under the constraint of minimizing the number of transfers.
 
 # tl;dr
 
@@ -145,4 +145,4 @@ B virtually paid ¥81 in total
 C virtually paid ¥76 in total
 ```
 
-Once it's expressed as a program, you can turn it into an app or a spreadsheet macro, whatever you like. Let the computer do all the tedious work!
+Once it's expressed as a program, you can turn it into a spreadsheet macro, whatever you like. Let the computer do all the tedious work.
````
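The post's body is truncated in this diff, so as a rough illustration only: a common greedy approach to the minimum-transfer idea repeatedly matches the largest creditor with the largest debtor (the post's actual algorithm may differ; the names and amounts below are made up):

```javascript
// balances: net amount each person should receive (+) or pay (-);
// the values must sum to zero.
function settle(balances) {
  const people = Object.entries(balances).map(([name, amount]) => ({ name, amount }));
  const transfers = [];
  while (true) {
    people.sort((a, b) => b.amount - a.amount);
    const creditor = people[0];
    const debtor = people[people.length - 1];
    if (creditor.amount <= 0) break; // everyone is settled
    const amount = Math.min(creditor.amount, -debtor.amount);
    transfers.push({ from: debtor.name, to: creditor.name, amount });
    creditor.amount -= amount;
    debtor.amount += amount;
  }
  return transfers;
}

// A paid ¥100 too much; B and C owe ¥40 and ¥60 respectively.
console.log(settle({ A: 100, B: -40, C: -60 }));
```

For this input the greedy matching settles everything in two transfers, C to A and B to A.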
Loading…
x
Reference in New Issue
Block a user